Those who forget history are doomed to be dogmatic software developers.
A lot of the stuff we take for granted is either an accident of history, or a powerful counter-reaction to an accident of history.
There is a practice, and it turns out to be bad. Mild discussion of the virtues and vices would, in a world composed of Asimovian robots, be sufficient to update the practice to something better.
But that's not how humans work! Typically an existing practice is only overturned by the loudest, most persuasive, most energetic voices. And they have to be. Humans don't come to the middle by being shown the middle. They come to the middle after being shown the other fringe.
So a generation changes its mind and moves closer to the new practice. Eventually, that is all the following generation has ever heard of. The original writing transforms from mind-shifting advocacy aimed at the undecided into Holy Writ. The historical context, and with it the chance to understand the middle way that had to be obscured in order to be found, is lost.
My previous role was as an engineer teaching other engineers an XP-flavoured style of engineering. I often referred to our practices as "dogma", because we were dogmatic. But if we weren't, less learning would take place. Dogma is most instructive when someone later finds its limits.
When I was learning to coach weightlifters, I was told something that has always stuck with me: "As a coach, you will tell trainees a series of increasingly accurate lies". You can't start with nuance. In the beginning, it won't work.
You can start with nuance and openly acknowledge the lie. This doesn't inhibit learning. Every modern textbook on Newtonian physics tells students that it is effectively a "lie", in that it's only an approximation that works reasonably well in most real-world cases. But you have to start there before learning general relativity and quantum physics.
You may have more experience than me, but at this point in my life I find myself disagreeing with this. In fact, one of the biggest problems I had with my own education was the teaching of dogma. When I did teaching, tutoring, and lecturing myself, on the other hand, I always tried to make it clear and explicit that what I was telling people was a practical simplification, that it has limits here and there, but that within these particular constraints it's a good approximation. And the feedback I got was always that it made things much clearer - people felt it made sense, because it had context.
What I taught was a way of working. I didn't deviate from the practices, because the principles are easy to state but hard to truly grok.
Going back to what I said earlier, this is the difference between weightlifting drills for various parts of the movement and discussions of physiology, anatomy, anthropometry, or physics.
Thanks for the clarification. So it's something like, first learn to do something in a decent way, and only then - when you're familiar with the subject matter - start thinking from first principles?
I think the Socratic method is a much better teaching device. Dogma is the opposite of teaching you to think for yourself. Dogma says "Follow these principles and you'll write good code, don't question it!" Principles need to be challenged, and proof needs to be provided of how and in what way they are really "best practices". The black-and-white thinking of dogma is the very reason why many religious groups don't progress with the modern world and insist they know better than everyone else without having to explain why. I'm not trying to diss all religions, but I was raised in a very strict one, where group-think and anecdotal evidence were used to justify poor decision making. I'd rather be a critical-thinking programmer than a principle-obeying one.
Maybe I need to use a different word, especially since I used it a little self-deprecatingly. I do answer questions (often at eyeball-desiccating length), but I also insist on the practices.
I sometimes referred to Pivotal Labs as a debating club that produces code as a by-product. Everything is up for debate. "Strong opinions, weakly held" was a frequent motto.
But that didn't mean we started from scratch. Almost all projects start with the core practices and stick with them fairly tenaciously and inflexibly (even in the face of circumstances we have seen before), in order to facilitate the immersion.
Yes. I need a different word. I am not conveying this well at all.