Transistors are civilization’s invisible infrastructure (ieee.org)
162 points by danboarder on Dec 9, 2022 | hide | past | favorite | 81 comments



As a child, presumably like many others, I was pretty enthralled by the concept of magic: magical formations, spells, and so on. And it was equally disheartening not to find oneself in such a world. But growing up, I realized that we already have magic in our world, which we call electricity. We have formations to harness its powers, which would be the chips, and we have spells, aka coding, to bring them all together and literally perform magic.


This dawned on me as well when I watched a Feynman video on electromagnetism. He explained that you can describe the "how" of its workings, but not the "why". The fundamental forces of the universe are basically magic fields. And this is true for anything that is fundamental and can't be deconstructed further into more primary components.


This! It's not just a 7-layer OSI stack (https://en.wikipedia.org/wiki/OSI_model); there are many layers below the first, some of which nobody understands.


Layer one encompasses the physical layer so I'm not convinced there's an additional layer to be had. It does require a deep understanding of the physics involved though.


We have a clear and clean description of the apparent bottom layer of the universe, but we also have clear evidence that it's incomplete, that there's something else going on. So, physicists are spending billions of dollars trying to look one layer deeper and find "new physics." To no avail, yet...


Not sure if you know Terry Pratchett and his Discworld series, but this was his exact train of thought.

http://www.telegraph.co.uk/culture/books/authorinterviews/10...



Danny Hillis makes a similar analogy in 'The Pattern on the Stone', where he reasons that if someone 200-300 years ago were told what he did for work, he'd be burned as a witch.


400-500 might be closer to accurate. 200-300 years ago was deep into the Enlightenment and well past the Scientific Revolution. Witch trials were already a fringe and rare thing by the 1600s.


>Witch trials were already a fringe and rare thing by the 1600s.

"The period of the European witch trials, with the most active phase and which saw the largest number of fatalities seems to have occurred between 1560 and 1630.[31][5] The period between 1560 and 1670 saw more than 40,000 deaths.[32]"

from https://en.m.wikipedia.org/wiki/Witch_trials_in_the_early_mo...

Did you find any actual statistics? I did not, but as far as I know witch trials happened after the Reformation and somewhat coincided with the Enlightenment.


This also makes sense, because witch hunting, socially, would be a response to the successes of the Scientific Revolution.

Without the results and successes of technological development, we probably wouldn't have had as many witch burnings and hunts.

Think about doctors walking around healing people, doing surgeries. That's basically what a witch does. Where else do they get their power?

The Massachusetts witch trials were 1692-93, and pardons only started being given out well into the 1700s.

https://www.smithsonianmag.com/history/a-brief-history-of-th...

There's also the critical theory about how the Enlightenment is only the Enlightenment in retrospect. During that time, it would have been a miserable existence full of broken and exploited people, and people believing in totally different realities and agendas.

It's like someone looking back 500 years in the future would say "during the covid era science developed significant breakthroughs in universal vaccine technology".

Whereas looking from today, it's all about vaccine/mask denialism, protests about 5G microchips, mass misinformation, mass deaths (a lot of people died in the last 2 years from disease), etc.


>There's also the critical theory about how the Enlightenment is only the Enlightenment in retrospect. During that time, it would have been a miserable existence full of broken and exploited people, and people believing in totally different realities and agendas.

That's not a theory, that's the obvious interpretation of what everyone knows about the time period, including the people who were living in it. Even the texts of the enlightenment are often about how bad they think their society is.


>Think about doctors walking around healing people, doing surgeries. That's basically what a witch does. Where else do they get their power?

Healers in the middle ages performed life saving surgeries, even on people with serious injuries to their skulls. (This isn't supposed to be a scientific defense of medieval healing techniques, but these were skilled people with limited tools and knowledge trying their best. Same as today.) Successful healers weren't considered witches, because they were recognized as a legitimate profession and the catholic church largely regarded "healing magic" as pagan superstition.


An everyday lunacy to me is people walking around using their smartphones and blathering things such that, if they only knew the technology imbued in that smartphone, they might be crushed by the physical force of the irony.

Listening to some teenager complain about how they "hate nerds" or <insert pseudoscience> or <insert antiscience idiocy> with a device that is using quantum-level physics, materials engineering, electrical engineering, math, computer science, chemistry, unimaginable nerd-centuries of effort.

Modern humanity exists in a miracle of technology. Witchcraft to people only 300 years ago. And to say nothing of heating, cooling, transportation, agriculture, medicine, etc.

Transistor/information technology, however, may be building such a stack of technological dependence that it is easily toppled, making our whole civilization much more fragile to disruption. We saw it with Covid, about as insignificant a disruption as could be imagined, basically a test run for globalized trade and interdependence. Generally, we failed.

But then again I'm reaching the age where "prepping" starts to infect the mind.


I think you're making a mistake by assuming that the Enlightenment and Scientific Revolution were universal phenomena, and not localized due to the world being a much larger and slower place.


It would very much depend on who you spoke to from the 1600s. If you spoke to an enlightened philosopher or scientist sure, but trying to explain it to people in a rural village may have a worse outcome for you.


1833 | In the United States, a Tennessee man was prosecuted for witchcraft.


I had to look up this story, and it's about as ridiculous as it sounds. This source places the event in 1835, but I believe it's the same incident:

http://southernghoststories.com/the-witch-of-fentress-county

>When Joseph Stout went before Judge Abraham Caruthers, Attorney General John B. McCormick refused to prosecute him. The witch-hunters tried to argue with Judge Caruthers and cited the Statutes of Henry VIII and James I which made witchcraft a felony. The judge shot them down and told them that the statute only applied in England and had no standing in the state of Tennessee.


Witch trials were supposed to be a modern scientific and legal process. They didn't happen in a systematic manner in the medieval period. So they are really an artifact of a narrow time period, bounded on one side by less proscriptive and thorough government, and on the other by Enlightenment philosophy about metaphysics and prosecution.


> 400-500 might be closer to accurate. 200-300 years ago

Probably more an issue with my memory than with Hillis' analogy.


This talk on how silicon chips are manufactured, called "Indistinguishable from Magic" shows just how magical they are: https://www.youtube.com/watch?v=NGFhc8R_uO4


As I say -- we imbue stones engraved with runes with lightning, making them come alive and do our bidding. Tell me again computers are not magic?


I grew up with Harry Potter, and now I can say that I eventually became a wizard. I just scribble runes onto paper, then do clicky-clacks on my keyboard, translating the runes into spells and making machines do magic.


I had a dream once where I somehow was in the far, far future. At first everything seemed like it had regressed or gone back to a more middle ages / middle earth type aesthetic. But then it became obvious that this was intentional - people had rejected a lot of technology and hidden others. So there were many things that did indeed seem like oldschool magic but were actually just powered behind the scenes by some very advanced tech.


See Ursula K. Le Guin's "Always Coming Home": https://en.wikipedia.org/wiki/Always_Coming_Home


This makes for a great writing prompt. In some ways, you're describing the Warhammer 40k universe. The people of that world don't necessarily "reject" technology, but any understanding of it is lost, and things are maintained through ancient relics; the meaning behind which is completely unknown. It's a great fiction premise.


IIRC, one of the alien species in the Stargate series (filling the thematic role of "space elves") appeared primitive because their "technology" was "sufficiently advanced" to the point of being practically magic. I think there were examples in Star Trek as well, but I can't remember specifics off the top of my head.

It would be an obvious strategy to hide from the Borg, but I don't know whether or not that ever happened. One could imagine a world where complex technology and knowledge were outlawed out of fear of "demons", after their far more advanced ancestors discovered the Borg umpteen thousand years ago and forcibly regressed their society to a more primitive state out of sheer terror, but no one understands the real context behind what's become just a deep cultural fear.

I'm also reminded of the claim by someone in the UFO community - I think it's Robert Bigelow, but it doesn't really matter because it's all bullshit anyway - that UFOs are actually powered by ESP; they aren't even technology in any comprehensible sense, more like technology-shaped talismans that run on the power of space magic and make-believe. It would be hilarious if that were true, and somewhere in the bowels of some ultra top-secret underground base someone is trying to reverse engineer some crashed alien "tech" and being utterly baffled because it literally makes no sense.


Fun story about a world with programmable magic: https://www.themagineer.com/


And also the Laundry Files where computation is magic (and also thins the walls of reality and allows the horrors that chitter behind to, well, not be behind it).

Also hydrofluoric dragons.



> But growing up, I realized that we already have magic in our world, which we call electricity.

Today it's kind of boring to most people (not me, obviously), but imagine how magical it was upon discovery and propagation. A common need in industry is to turn a shaft, which we now do with electric motors. Prior to the motor, the only installable, on-demand source of rotational power was a steam engine. It had lots of moving parts, required a boiler, and needed a fuel source, either coal or wood, that had to be carted in by horse. Then you had to hire skilled and trained operators to maintain the engines and the boilers.

With electricity, all you do is connect two or three wires to a hunk of iron and copper, and a shaft supported by two bearings spins. Just make sure the bearings are oiled and the brushes (DC motors were once common) are in proper order, and you're good to go. There is no dangerous combustion or flue gas, no pipes or burn hazards, no fuel storage or its hazards, no bulky boiler, and no need for boilermen. In an instant an invisible force is pushing the shaft around, whose only wear items are serviceable bearings. If you needed light to see the machines these motors operated, you just connected two wires to a glass sphere which emitted a bright illuminating glow. That's magic.

Now, thanks to modern high-power transistors and smaller, faster transistors in microchips, we can tame this magical force much more accurately and cheaply, enabling electric cars, LED lighting, solar and renewables, high-efficiency switching power supplies, and so much more.

All this thanks to the magical invisible force of electricity and electromagnetism tamed by our equally magical semiconducting devices. The modern world is literally moved by these devices.

I consider wireless to be black magic which is also enabled by really fast transistors.


Yes and in dark souls magic spells are enabled by the intelligence stat


Sadly this analogy begs the comparison to the Faith stat giving you magic defense. :(


> to literally perform magic.

Yeah but the magic happens mostly behind a flat surface that we call a screen.


Mostly... unless you get a few servos and an Arduino. That's all I needed to start down my path from my "below the glass" 100% software job to working in robotics and making the magic work "above the glass".


Well we also have these magic machines that drive themselves around. And pretty soon we’ll be able to use our magic surfaces to order them to take us places.


Most of the magic happens inside the silicon chip: you have to flatten the rock and put lightning inside it.


You put the lightning inside by using a mystical combination of poisons (dopants) and engravings (rubylith).


Ok this addition was just so cool, thank you for adding it!


And watch cat videos. :)


The high-quality moving images on a flat screen still feel like magic.


Would it be fair to call the past 5-7 decades the "Silicon Age" or the "Transistor Age"?


I mean, transistors are neat and all, but what really sets things apart for me is integrated circuits. When you look at the size of the discrete MOnSter 6502 CPU[1], featured here[2] recently, and realize it has around 3k transistors while the latest CPUs and GPUs have several billion...

And not just transistors, but analog circuits as well, allowing for extremely compact designs.

[1]: https://monster6502.com/

[2]: https://news.ycombinator.com/item?id=33841901


The real driving force for about the last 40 years--starting with PCs, then larger systems, and eventually almost everything--was integrated circuits using CMOS technology. CMOS process shrinks have been an incredible lever. Not the only one, of course, but a major one. The transistor itself was an incredibly important invention, but it's the ability to pack so many of them economically into a very small amount of space that really delivered on its promise.


I think the transistor age would make more sense; it allows the age to cover a broader stretch of history. Think about things like the stone/bronze/iron ages: it's not any one new tool but the base technology itself. So if silicon gets supplanted, I don't think we will stop using transistors.

But who knows with quantum photonics, maybe laser/optic circuits will supplant a large chunk of silicon transistors in the next 100 years


If you're going for a modern version of "Ages of Humanity" akin to Hesiod/Ovid, sure, if not Digital or Information.

But Stone/Copper/Bronze/Iron refer specifically to the dominant material for crafting weaponry. In that sense, maybe we're in the transition between Steel and Plastic, though maybe a case could be made for Lead.

Or we really are in the Nuclear Age, and it becomes the Final Age of Humanity.

Or war has gone digital, and we are in the Information Age, beyond terrestrial materials. In that case, true Space and then Atomic (universal assembler) Ages may follow.


Historians have been calling it the Information Age. Transistors are just gates or amplifiers for information carried over electricity.


The IBM 1401 was the most popular computer of the early 1960s, with over 10,000 produced. It was built from germanium (not silicon) transistors. So silicon shouldn't get all the credit.


I wrote a lot of 1401 code in the '60s at UCLA, including a 1401 assembler (5x faster than IBM's), a floating point package, and an assembler for the historic SWAC there. It was a lot of fun.


If you blogged about it, we would all read it.


Transistor, information, or automation could be good, I think. Another option, maybe: what really distinguishes our civilization right now is that we've integrated our economies around the entire planet, so "global age" might be a good candidate.

Silicon is present in sand, so we've been using it forever in ceramics, on some level. Like copper, it has been a good friend to humanity for a very long time. The only problem is that "using lots of Si" is not a distinguishing characteristic for an age!

We call it the Iron Age because iron is a grumpy, unfriendly element that didn’t want to help out until we got some pretty fancy forges.

Maybe we could use it to mark some larger scales. Silicon age could start around 25k years ago with the invention of pottery. Before that, I dunno, fire age or rock age.


“Silicon Age” is being used, though “Information Age” and “Digital Age” are more common: https://en.wikipedia.org/wiki/Information_Age


Transistors made of non-silicon materials have existed for a long time.


No, I think the concept of ages is itself a historiographic fiction that peaked in the 20th century, as an attempt to impose order and narrative on the unstructured chaos of history. But they are one-dimensional, westcentric, and IMHO obsolete.


Historians of the future will probably refer to the dot-com bubble as the start of the Silicon Age. But another take could be that the silicon and transistor age started at the same time with commercial silicon transistors (mid 50s).


Historians lump ever longer stretches of time together the further into the past stuff becomes. I suspect they will start pre WWII with mechanical calculators or focus on computer networks as the defining moment.


Computational, analytical, scientific, or something like that, maybe?


Only time will tell, but at least for now it seems like either will work for the next few thousand years. Iron is still important, and bronze is still used, but silicon seems like the backbone of our civilization now.


Si (in our ceramics) has been a popular construction and pottery material forever; pottery shards are, after all, one of the main ways that we distinguish which ancient cultures were spreading where. The problem with "Silicon Age" is that using lots of silicon is not a distinguishing factor for an age, haha.


For the next few thousand years? I sure wouldn't make a prediction about what technology is going to be important with a time horizon that long. Vacuum tubes are barely a century old.


Let's reserve the term "Silicon Age" for when they take over.


"The Computing Machines in the Future" (1985) http://cse.unl.edu/~seth/434/Web%20Links%20and%20Docs/Feynma...

In section 3, Reducing the Size, he speculates about having a billion transistors in a computer, then immediately qualifies his statement.

That was almost forty years ago.


Whenever I read sci-fi stories and there is talk of self replicating robots/factories/machines or even generation ships, there is a huge jump in technology/magic that is never discussed: what technology is used to replace semiconductors, because a portable semiconductor fab seems so outside the realm of possibility now. It would be such a revolutionary change that it's outside the realm of speculation, besides handwaving at bioengineered replacements or easy atomic-level manipulation à la The Culture series.


My thinking is: there are many ways to make matter do compute. Semiconductor-based digital logic chips are only one of many possibilities.

You can make an analog computer from just about anything; it's a matter of identifying things you can interpret as flows and stores (or, as the professors who taught control systems at my uni put it, faucets, drains, and bathtubs), and arranging them just right. You can make a digital computer from just about anything too: it's a matter of identifying things you can interpret as a NAND gate, and stacking them just right. You can encode a neural network model as grooves in plastic sheets, stack them into layers, and have it naturally process incoming light. Etc. The possibilities are limitless, because computing is more about your mental model, and less about the substrate itself.
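The NAND point above is worth spelling out: NAND is functionally complete, so any Boolean circuit can be stacked from it alone. A minimal sketch (plain Python standing in for whatever substrate provides the gate; the function names are just illustrative):

```python
# NAND is functionally complete: every other Boolean gate
# can be built by wiring NANDs together.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a):            # NOT x == NAND(x, x)
    return nand(a, a)

def and_(a, b):         # AND == NOT(NAND)
    return not_(nand(a, b))

def or_(a, b):          # De Morgan: a OR b == NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

def xor(a, b):          # XOR from four NANDs
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Sanity check over all input combinations
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
```

Find anything in your substrate that behaves like `nand`, and the rest of digital logic follows by composition.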

So I imagine the breakthrough for nanotech will be when someone figures out a way to make some kind of programmable computer that works at nanoscale, can be easily produced, and is somewhat robust against the environment. It doesn't need to be a fast computer: we're used to multi-GHz CPUs and multi-MHz microcontrollers, but at nanoscale even an equivalent of an old 8008 will do, or even something much weaker; a single nanobot doesn't need much compute. You'll be deploying them by the thousands or millions at a time, and you'll want to program them as a swarm anyway.



I assume in those stories that nanotech has displaced photolithography in the fabrication of microelectronics. If you can get nanoassemblers to reliably produce a simple CPU, you can pick up the slack through software and bootstrap from there.


> In fact, each of us is surrounded by billions, if not trillions of transistors, none of which are visible to the naked eye.

* glances over at bins of assorted TO-92 and TO-220's *

checkmate, Mr. Goldstein.


Fair, though even still, those are hidden away in little plastic boxes. The actual meat of a transistor isn't really visible except by analogy, e.g. observing an equivalent behaviour going on inside a glass vacuum tube.


There are pretty much only three major inventions (he claimed boldly, hoping to spark a fun and informative discussion):

The Haber-Bosch Process, without which none of us would be here reading this today.

The transistor. 'natch.

Rare earth magnets (which enable small, strong motors, which enable tiny drones and factories, which revolutionize economics).


And all made possible by abundant energy from petroleum products.


The lesson is that any damned fool can have a prospering economy when he's got energy to burn. But in the transition from petroleum to fusion & renewables, it is necessary to be a bit more clever. Rotsa ruck!


This article has a link to the IEEE electron devices meeting, and flipping through the presentations, optical interconnects come up, leading to the still rather sci-fi notion of the photonic computer:

https://en.wikipedia.org/wiki/Optical_transistor

> "In principle, all-optical digital signal processing and routing is achievable using optical transistors arranged into photonic integrated circuits. The same devices could be used to create new types of optical amplifiers to compensate for signal attenuation along transmission lines."


Indeed... just completed "Chip War" by Chris Miller, which delves into the history and geopolitics of this wonderful invention. After WWII, Japan, Singapore, Taiwan, and South Korea owe their prosperity to the transistor.


Your comment reminds me of a line by Boromir (Sean Bean) in The Lord of the Rings: The Fellowship of the Ring.

"It is a strange fate we should suffer so much fear and doubt… over so small a thing. Such a little thing."

Where the little thing here is the transistor.


Which reminded me of the mine scene from Back to the Future 3

Doc Brown examining a microchip with a magnifying glass

"Unbelievable that this little piece of junk could be such a big problem. No wonder this circuit fails, it says 'Made in Japan'."

Marty "Whadda mean, Doc? All the best stuff is made in Japan."

Doc "Unbelievable"


Transistors are the dominant life-form on Earth.


Transistors form the nervous systems of the dominant life form, motor vehicles.


I can't help but think of this analogy: transistors are like Judo. You keep your feet firmly on the ground, then by applying a small amount of energy at a point you control (the base current), you use to your advantage a much bigger energy that comes from the opponent (the collector current through the load).
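The leverage in that analogy is just current gain: in the active region a bipolar transistor's collector current is roughly beta times the base current, capped by what the supply and load allow. A rough back-of-the-envelope sketch, with purely illustrative values (real transistors also involve V_BE drops and temperature effects, ignored here):

```python
# Idealized "Judo leverage" of a bipolar transistor:
# Ic ~= beta * Ib in the active region, limited by the
# load line (saturation) at roughly Vsupply / Rload.

def collector_current(i_base: float, beta: float = 100.0,
                      v_supply: float = 12.0, r_load: float = 1000.0) -> float:
    ic = beta * i_base          # ideal active-region gain
    ic_max = v_supply / r_load  # saturation limit set by supply and load
    return min(ic, ic_max)

# 50 microamps at the base steering about 5 milliamps through the load:
print(collector_current(50e-6))   # a ~100x "force multiplier"
```

Push the base current high enough and the gain runs out (`ic_max`): the transistor saturates, which is exactly the switch-like behaviour digital logic exploits.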


I'd be willing to say electromagnetic waves and signals are the true invisible, magic infrastructure. But of course transistors and ICs do most of the magic.


Wifi still blows my mind.


Isn't this 20th century history 101? Not much to dispute here, but it might bear repeating.



