"Despite Tanenbaum’s early work on silicon transistors, AT&T did not support further research or advancement of the technology. At the time, Bell Labs was the research arm of AT&T. Although Bell Labs had “a significant technological lead in silicon transistor technology, it stopped doing proper research in the field—partly because it just wasn’t immediately relevant to AT&T’s business—so silicon transistor technology, including the integrated circuit, was done by Intel and Texas Instruments instead,” Tanenbaum said in a 1999 oral history conducted by the IEEE History Center."
> so silicon transistor technology, including the integrated circuit, was done by Intel and Texas Instruments instead
Right. Tanenbaum invented the silicon transistor (which of course was a major accomplishment -- definitely worthy of honor), but what most people think of as a "microchip" (i.e., an integrated circuit with multiple components on the same piece of semiconductor) was invented almost simultaneously by Jack Kilby at Texas Instruments and Robert Noyce (then at Fairchild, later a cofounder of Intel).
Every day I drive past the spot where Fairchild was at the time. It's next to the cheapest gas station in Palo Alto (not that that's saying much) and across the street from what used to be Sun's HQ before they moved to where Facebook is now.
Here's another source that disagrees (BTL = Bell Telephone Laboratories):
"In spite of its secondary role as a semiconductor supplier, the course of semiconductor research was shaped by the scientific and engineering achievements at BTL during the 1950s. While most of the semiconductor industry concentrated on manufacturing germanium transistors and diodes during this decade, BTL spent most of its research dollars on silicon devices... Of the ten major events instrumental in the demise of germanium as the preeminent starting material, seven originated at BTL while the other three were derived from BTL research activities. "[0]
That -agrees-. AT&T built a substantial lead in silicon devices during the 1950s, like your source says.
TI and Fairchild pulled firmly ahead in the 1960s; Intel became hugely relevant in the 70s and 80s. AT&T had mostly abandoned silicon research by this time.
I went looking for a recent estimate of the number of transistors that have been manufactured. It appears to be in the range of 10^23. I'm pretty sure there is nothing else anyone has invented that has been replicated in such volume.
It's amazing to think of a thing someone has made, or has helped make, being so numerous. And a physical thing too: not software or words or an idea, but an actual thing.
Transistors on a CPU are actual things. Intel and AMD CPUs have billions of them. One chip, the Cerebras Wafer-Scale Engine (WSE), as they describe it, "has 2.6 trillion transistors, which make up 850,000 AI-optimized processing units". That's one big chip!
I'm not so sure. Last time I followed the field, modern HDDs were nanostructured, such that each magnetic domain was literally a physically distinguishable object, in a sense. I'm not sure if that's still the case, and I'm also not 100% sure that ever made it into production, but I'm pretty sure it did.
At least as of the last time I read about it, the domains were a pretty uncontrolled result of the crystallization, and each bit covers a few of them statistically. Nanopatterning of the sort you're describing would let you make the bits smaller, and hence weaker, by ensuring separation and a friendlier magnetic environment.
Be careful with large numbers: 10^23 shoes would be roughly 854B shoes for each of the ~117B humans who have ever lived; that is about 14 million pairs of shoes per individual per day. I am pretty sure there has never been a market for that many shoes.
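A quick Python sanity check of this arithmetic (the ~80-year lifespan is my own assumption; the other figures come from the comment above):

```python
# Quick check of the shoe arithmetic above.
# Assumptions (mine, for illustration): ~117e9 humans have ever lived,
# an ~80-year lifespan (~29,220 days), 2 shoes per pair.
transistors = 1e23
humans_ever = 117e9
shoes_per_human = transistors / humans_ever   # ~8.55e11, i.e. ~854 billion
lifetime_days = 80 * 365.25                   # ~29,220 days
pairs_per_day = shoes_per_human / 2 / lifetime_days  # ~14.6 million
print(f"{shoes_per_human:.3g} shoes per human, {pairs_per_day:.3g} pairs per day")
```

The numbers line up with the comment: about 854 billion shoes per human, or roughly 14-15 million pairs per person per day of an 80-year life.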
An average paper clip is 204.8 mm^3 in volume. 10^23 of those would be 2.048x10^19 L in volume[1]. That would be only two orders of magnitude less than the total volume of the oceans, 1.33x10^21 L[2]. So I think not.
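The same comparison, checked in Python (the clip volume and ocean volume are taken as given from the comment above):

```python
# Checking the paper-clip volume comparison above.
clip_volume_mm3 = 204.8                        # assumed average paper-clip volume
count = 1e23
total_litres = clip_volume_mm3 * count / 1e6   # 1 L = 1e6 mm^3 -> 2.048e19 L
ocean_litres = 1.33e21
print(total_litres, ocean_litres / total_litres)
```

The ratio comes out at about 65x, so "two orders of magnitude" is a slight overstatement, but the oceans would indeed comfortably swallow 10^23 paper clips.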
It's incredibly humbling to be reminded how young this field of ours is: this site and everything its presence implies (about the Internet, global communication, computers) would not exist were it not for the work of a very-recently-deceased engineer.
Very few other fields (aerospace and modern pharmacology, among them) can say the same thing.
It is interesting to think about alternative history here: if microchip technology had never been invented for some reason, what would we have developed instead? What is the next best computing technology?
Nowadays, people think of just vacuum tubes and then transistors, but in the 1960s there were a lot of different technologies being used. For example, superconducting cryotrons, magnetic core logic, parametric-phase-locked-oscillator logic, microwave logic, electroluminescent-photoconductor logic, and tunnel diodes. If transistors didn't pan out, there were lots of other technologies that could have taken over. Magnetic core logic (not to be confused with magnetic core storage) in particular was used in several computers and seemed poised for success until advances in transistors killed it off.
I think we would have ended up here anyways: to assume that the silicon transistor would not have been invented without Morris Tanenbaum is to succumb to the Great Man theory[1].
The reality is that Morris was a brilliant man whose life intersected in the right place and time with the arc of history; we should celebrate his life and contributions for that, and not because we think that we'd be stuck in 1950 without them.
Not sure but I think the question is more like "what if physics didn't allow transistors?" rather than if one guy didn't invent them. What's the next best thing, and how far would we have pushed it?
If transistors were not possible, then lots of other stuff would not exist either, like solar panels or digital cameras. Handy electricity conversion, etc.
I think alternatively we would have miniaturized vacuum tubes much more, the ones that powered computers before transistors (and made them huge).
Surprising to me: size-reduced vacuum tubes happened, and it's still being worked on, as I understand it. There are some quite miniature working vacuum tubes; if you search you'll find more information.
Inventions often seem to be simultaneous, as if the time had come for this problem to be recognized, and for the solution to be possible. However, I do think in some cases, a "great man" can make an invention earlier than otherwise - we should measure genius in units of time.
We wouldn't be stuck in 1950 without him... but maybe we'd now be in 2018 without him.
OTOH the "great man" fallacy encourages the intense and unpredictable work required for many inventions - why undermine that?
You are correct. If I recollect correctly, the electric bulb had at least 10 inventors. In fact, one of them was an Englishman who had patented it in England two years before Thomas Edison patented it in America.
Radio transmission/reception had two other inventors besides Marconi.
I don't think anyone has ever explored an alternative here. Transistors came out of exploratory research that Bell Labs funded and, as the other comments say, this happened very recently! I suppose the silicon industry and the digital age built up so fast that we never really had the incentive to explore the alternatives?
Interesting hypothetical. If semiconductors were never discovered, I suspect we would have kept shrinking vacuum tubes, eventually down to the point (which has now been demonstrated) where atmospheric pressure is low enough to operate at.
It's incredibly humbling to think that his #2 and #3 life achievements were building practical high field superconducting magnets (making MRI machines possible a decade later) and being CEO of AT&T.
If you haven’t read “The Idea Factory” by Gertner, it’s really worth the time. The book is well written and it covers a very exciting time in our technological progress. Dr. Tanenbaum is discussed in the book along with the rest of the dream team at Bell Labs during those years.
It seems the headline is a bit editorialized with 'Microchip', as he does not seem to have been involved in the subsequent development of the integrated circuit, which is what 'microchip' commonly refers to.
Though, after checking, there is apparently no clear-cut 'inventor' of the integrated circuit, which strikes me as a bit odd.
The article seems to imply that "silicon transistor" would be more precise than "microchip" so I've put that in the title now, at least until someone explains how it's wrong.
Even prior to those guys, Shockley, Bardeen, and Brattain received the 1956 Nobel Prize in Physics for their work. Kilby received the same in 2000 (Noyce had passed away in 1990).
This is the same fallacy as when people say Europe doesn't contribute to the tech industry because it doesn't have huge software companies. All the software companies progress on the back of several Asian and European companies: most semiconductor manufacturing equipment is made either by European companies (ASML, Zeiss, etc.) or by Asian companies (TEL, TSMC, Gigaphoton, etc.). This is why ASML, until a few years ago, used to bill itself as "the most important company you have never heard of".
Fame is never fair and balanced, it seems. I would think there are lots of other engineers, who don't even have a Wikipedia article, who made important contributions as well.
Dern. I had a physics lecturer at, of all places, De Anza College (Cupertino, CA), who worked in that lab. An OG engineer teaching physics/engineering fundamentals.
Morris Tanenbaum (American, physical chemist / transistor researcher) [1], not to be confused with Andrew Tanenbaum (American-Dutch, CS professor who was responsible for MINIX) [2]. Both had great significance in what we call "computers" today, in their respective fields.
Hah, I read many books by Andrew Tanenbaum. I remember him putting in dry comments like 'It is not known whether CPUs have dreams during their deepest sleep states.'
Andrew Tanenbaum is in no way Dutch. His Wikipedia article states as much. There are plenty of Dutch scientists, including in CS, so there's no need to colonize another one.
It's 52 years: https://www.cs.vu.nl/~ast/home/cv.pdf, but he's still an American citizen. You be the judge, but it's certainly not unfair to call him American-Dutch.
The tree of technological and scientific developments that enables almost all of us - on this site - to do what we do narrows back to just a few people and their work. It seems that this man was one of them. Thank you, Mr Tanenbaum.
Good call AT&T! /s