In episode 60 [1] of the great (and sadly, ending soon) podcast Pragmatic, John Chidgey talks about the history of programming and, more specifically, about Turing, arguing that Turing is overrated. I largely agree, and since I send John feedback after every episode, I sent him a longish rant, reproduced as follows:
When talking about the history of programming, there's always a difficulty because of the difference between computing and programming. It's nearly undeniable, for example, that Vannevar Bush had a greater impact on computing than did John Von Neumann (I'll take the haters on with that). As We May Think is more important than Von Neumann's draft, because what Von Neumann was talking about had already been created and codified, whereas As We May Think was much more of a visionary work. Put another way, I think Von Neumann was a brilliant synthesist, bringing together the ideas of many into the next step in the evolution of whatever he was working on at the time. Bush, on the other hand, was a management/strategic-level thinker who saw the revolutionary step ahead of whatever he was working on. But Bush was not a programmer in any way, and his impact on programming is limited. Similarly, I think Turing is overly lionized for his programming contributions because of the Turing test and Eliza. He provided a lot of the theoretical underpinnings, but I would argue that C.A.R. Hoare's contributions have had a farther-reaching practical impact than Turing's. I suspect that Turing and Von Neumann had political connections in the ACM and IEEE that led to their improved historical standing. But as always, historical/political credit is almost as much a function of who you know and where you are as of what you've done. (I'm a huge fan of Shannon, Mauchly, Adm. Hopper, Hoare, and Dijkstra, over others like Turing and Von Neumann.)
It's also very difficult, when just surveying computer history, to give proper credit to Zuse, Lebedev, Scherbius, Rejewski, etc., due to the lack of English-language resources on their accomplishments and the lingering bias against the governments some of them worked for. One could make a very compelling argument for Scherbius as having created _the_ most pivotal invention of the twentieth century, because Enigma drove the large investment in cryptanalysis, which in turn led to the large investment in devices that eventually became the general-purpose computing machines of today.
There's also a large, long discussion to be had about the impact figures like Marvin Minsky, Bill Gosper, Richard Greenblatt, Dennis Ritchie, Ken Thompson, Richard Stallman, Linus Torvalds, Brendan Eich, Donald Knuth, Peter Norvig, or Alan Kay had on programming, and also about the impact figures like Jack Goldman, Doug Engelbart, Steve Wozniak, Thomas Watson, Robert Noyce, Gordon Moore, David Packard, Bill Hewlett, Steve Jobs, Nolan Bushnell, Larry Ellison, Bill Gates, Vint Cerf, or Jim Clark had on computing. Few of those guys cross over from programming to computing or vice versa, yet each made critical contributions to be discussed and looked at. I don't know of a lot of good books or resources, though, that really tackle this. Steven Levy's Hackers is the canonical example, but it is heavily biased toward the AI Lab crowds, Lisp hackers, and the early Unix pioneers; it touches on the big industry/engineer types only tangentially or even scornfully, and almost completely ignores the military/NASA. I also really appreciated Peter Seibel's Coders at Work, which was more inclusive but not really a history, more a set of conversations. I'm told Petzold's CODE is good, and of course deeper into history there's Gödel, Escher, Bach: An Eternal Golden Braid, which I have on my bookshelf and have to admit not getting too far into because I don't really like math. Either way, as I'm sure you know, there is a wide, wide history of programming and computing that could be explored more, and I find it a shame that no one has done so with the historical rigor that I would like. (Such a book would probably cost $100+ because sales would be so small, since few people would be interested, and that would discourage others from taking up future projects.)
I agree that Turing's practical impact has been limited. And he probably had political connections related to the war effort. But I don't feel that he is overrated. He was a first-class theoretician. Some non-trivial results that I know of are:
1. A version of the central limit theorem. [1]
2. A fixed-point combinator, in his proof of the equivalence of the lambda calculus and computable functions. Anyone who has tried to derive the Y-combinator (appropriate for this website) knows how tricky it is to get one; Turing seems to have done all this in an appendix (see the sketch after this list). [2]
3. LU decomposition. [3]
4. I don't understand his work on morphogenesis, but it seems to be his most cited paper. [4]
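On the fixed-point point, here is a minimal sketch (my own illustration, not Turing's construction) of a usable fixed-point combinator in Python; since Python evaluates strictly, this is the applicative-order Z-combinator rather than the textbook Y, and Z and fact are purely illustrative names:

    # Applicative-order fixed-point combinator (the Z-combinator).
    # The eta-expansion (lambda v: x(x)(v)) delays the self-application,
    # which is exactly the trick that keeps a strict language from
    # recursing forever while the fixed point is being built.
    Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

    # Factorial defined without referring to itself; Z ties the recursive knot.
    fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

    print(fact(5))  # 120

Even with the answer in front of you, it takes some staring to see why the eta-expansion is necessary, which is part of why deriving one from scratch is so tricky.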
I largely agree with the list of pioneers you have pointed out, just not with berating Turing.
There's always a tendency to focus on a few "pivotal creators" when the reality is much more diffuse. People like to know who the inventor of X was, and the reality is that the enshrined inventor, at best, made a particular advance in commercialization or practicality--or simply got the good press.
Hackers is a good read, but it arguably sacrifices historical completeness for narrative flow. You mention the focus on the AI Lab, but the book also argues that everything that happened on the east coast was eclipsed. Then again, that's what good stories do. I watched The Imitation Game last night and it took enormous historical liberties--probably too many. Breaking the Code is better in that regard. On the other hand, I saw a historical play about the invention of ether last year that I felt was harmed by too-literal attention to less important threads of the central story.
[1] http://techdistortion.com/podcasts/pragmatic/episode-60-or-w...