> The first autocode and its compiler were developed by Alick Glennie in 1952 for the Mark 1 computer at the University of Manchester and is considered by some to be the first compiled programming language.
which seemingly contradicts the implication in the NYT article that Brooker was the principal author of Autocode.
It seems Brooker and Glennie were coworkers at Turing's lab. Anyone know more?
EDIT: The Glennie claim is sourced to a 1976 paper coauthored by Donald Knuth, where Knuth says
> The unsung hero of this development was Alick E. Glennie of Fort Halstead, the Royal Armaments Research Establishment. We may justly say "unsung" because it is very difficult to deduce from the published literature that Glennie introduced this system.
https://archive.org/details/DTIC_ADA032123/page/n43
There's no contradiction here, because Glennie's first attempt at autocode isn't referred to as "Mark 1 Autocode" (despite running on the Mark 1). Apparently it wasn't an influence on later developments. Wikipedia is pretty clear about this. https://en.wikipedia.org/wiki/Autocode
"Previous to Brooker's work, Alick Glennie who was an MOS external user of the machine had developed his personal automatic coding scheme in the summer of 1952. Glennie's system, which he called autocode, was very machine-dependent and was not released to other users." -- Early computing in Britain, 1948-1958, pg 257
I've always stayed away from Intel assembly, but I thought I'd have a go at running this. The first issue was that this is, I think, Intel format, whereas I was using GNU as, so I had to make a few changes:
Mainly because the assembler I knew I had on my computer was the GNU assembler and it uses that syntax. I did quickly try the .intel_syntax directive but it was just easier to change it around as the code was so short.
Most of my assembler experience is with 6502 or ARM.
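For anyone who hasn't seen both syntaxes side by side, the changes are mostly mechanical: operand order is reversed, registers get a % prefix, immediates get a $ prefix, and GNU as wants size suffixes like movl. A rough sketch (not the code from the article, just a generic 32-bit Linux sys_write call, with msg and len as placeholder labels):

    ; Intel/nasm syntax
    mov edx, len        ; length of the message
    mov ecx, msg        ; address of the message
    mov ebx, 1          ; file descriptor 1 (stdout)
    mov eax, 4          ; sys_write call number
    int 0x80

    # Equivalent AT&T/GNU as syntax
    movl $len, %edx     # operand order reversed, $ marks immediates/constants
    movl $msg, %ecx     # % prefix on registers
    movl $1, %ebx
    movl $4, %eax
    int  $0x80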
A couple of online places where you can try this and play around with the code. The "%" after msg on the last line seems to cause an error, at least when I tried it with my version of nasm, so I removed it in the links below.
A small website with a few dozen compilers/assemblers, just press F8 or click "Run it" to try the sample above:
"One programmer was Vera Hewison, whom he married in 1957. (She died in 2018.) Another was Mary Lee Woods, whose son, Tim Berners-Lee, would go on to invent the World Wide Web."
The industry really was (still is I guess) very small.
Indeed. When I was a young CPU designer (working on walk-in, refrigerated, mainframes implemented in 100K-series ECL) several of the old bulls I worked with had been CPU designers in the days of vacuum tubes. I loved hearing the stories of the early days, and considered it a privilege to learn the craft from them.
Modern technology is mainly the result of chemistry and materials sciences. Key events are the discovery of nitrocellulose in the 1840s and the introduction of the Bessemer process of steelmaking in the 1850s.
Nitrocellulose starts the development of guncotton, high explosives, celluloid, photographic film, and all sorts of modern plastics. Cheap steel enables engines, steel hulled ships, ICE vehicles, chemical plants, oil refineries, electrical machinery, etc.
Further progress in chemical engineering results in fuels, pesticides, fertilizers, pharmaceuticals, solid state devices, integrated circuits, lasers, etc.
Chemical engineering is mostly about 'plumbing': pipes, tanks, valves, fluid dynamics, etc. From what I understand, it's much closer to physics than to chemistry most of the time.
Chemistry doesn't have the same stereotypical relationship to engineering that physics does.
(This comment is based on what these terms mean in the UK at an undergraduate degree level. I'm an electronics engineer and I know a couple people who studied chemical engineering. If the terms mean something different elsewhere in the world I'd love to know :) )
That's a case that might be argued, though pinning down both what technology is and quantifying "innovation" is ... notoriously prickly.
My working definition borrows from John Stuart Mill, who identifies technology as the study of means (to an end or goal), while science is the study of causes (how or why things happen, to which I'd add a general notion of "structured knowledge").
There are other forms of knowledge, an interesting topic itself, but I'll skip that.
There's a tremendous set of ancient technologies, which can get expansive depending on your views. Everything from speech to simple machines, textiles, agriculture, medicine, metallurgy and mining, and ancient chemistry and alchemy.
What changed, arguably at some point between roughly 1620 (the publication of Francis Bacon's Novum Organum) and about 1800 (the expiry of the patent on James Watt's enhancements to the Newcomen steam engine), was attitudes toward both science and technology (or the practical arts, as they were then called), due to numerous factors. Much of that owed to the availability of better and more abundant (at least in the short term) fuels: coal, oil, and natural gas, and to the capabilities those afforded, especially in metallurgy (greater strength, purity, specifically-tuned characteristics, and of course abundance), as well as to advances in the understanding of natural phenomena: optics, thermodynamics, elements, electricity, and later radioactivity, each affording more capabilities.
There was still a huge amount of pre-industrial, non-industrial technology, much of it originating in China and documented spectacularly in Joseph Needham's Science and Civilisation in China, a 30+ volume opus begun in the 1950s and still in development.
Many studies of technology look at patent filings, which are at best a rather poor metric -- one that's in many ways a bureaucratic, commercial, and ontological artefact. Looking at costs and derived value might be of more use. I've been looking into an ontology of technological mechanisms, for which I've generally settled on about nine factors (discussed in other comments on HN, as well as elsewhere) that I've found useful.
Much of what is commonly called "technology" today falls into only a very narrow region of that. And much of what is considered economic growth can be traced very specifically to the increased energy available per capita in productive use.
There's also a pronounced pattern of diminishing returns to increased innovation and R&D generally, suggesting an other-than-bottomless reservoir of potential from which to draw.
I was completely unfamiliar with the subject matter, but even before I saw the comments I was convinced that people were going to say that what he did is not assembly.
Apologies dang! I guess I observed it as relatively substantive in the given context. But I did study ancient history, so probably have a warped perception of domain-specific quips.
This is something that bothers me a lot. I have traveled to the UK a lot and oh, what a fine nation! I love being in the UK. Good weather, amazing people, great food and drinks!
The literary culture is still very strong in the UK. But what happened to its science and technology landscape? Merely a century ago, it was at the forefront of science and technology. Where did the UK lose its steam? Anyone with historical insights into the UK care to shed more light on this?
Why do you think the weather in the UK isn’t good? Because it rains sometimes? Why’s that ‘bad’? There’s nothing wrong with rain. What do you want instead? Just bland boring sunshine all the time? How dull.
Well, it sort of depends. Comparatively speaking, the UK still gets less sunshine than lots of places on earth, but compared to 20-30 years ago, the UK (at least in the southern part, like London) now has really hot summers and decent weather.
Certainly here in eastern Australia at the moment, the lack of rain this early in summer is quite depressing. (The widespread bushfires and smoke are likely to be with us for the next few months.)
A British friend of mine always took exception to the stereotype of Britain having bad food; his line was that they had the best food in Europe, giving as an example all the imported Indian dishes.
That probably WAS true back in the day. But then again, the same was true of NYC. It was still a center of upscale dining, but there were essentially no interesting mid-level options.
The Internet changed everything. Someone starts a food trend in Austin, and two weeks later it's on the menus in L.A.
London has great food, like most megacities. But although the quality of food across the UK has improved considerably since the mid 90s, I wouldn’t qualify the general level as “great”, especially compared to their southern neighbor...
Well, the British are not good at two things: optimism and marketing. It is just not their way to go and shout to the world how great they are. DeepMind, ARM, ImgTech, Icera, etc.
Although one may argue they are no longer "British", given that all of them are no longer British-owned.
Aside: I understand that the decision to give someone a black bar is a subjective one, but I’m curious as to the criteria. Is it specifically people who have contributed in an important way to the computing community? The scientific community? Society at large? Does the number of upvotes on the “has died” article matter? Would love for ‘dang to weigh in.
Thanks for the link. I didn't mean to undermine the story here; it's just that frustration has been building today, and this is finally one I really want/need to read and it's blocked again.
The NY Times has a monthly threshold. Their obit has some nice social info the Guardian's does not, and vice versa. If you can use incognito mode to get round the paywall, the NY Times article is worth reading. Tim Berners-Lee's mother worked with Brooker!
Bad freakin' week for famous people. It's almost as if more people die in winter, and this is the leading edge of the generation whose fame could reach a global population.