We are currently going down a path of chip 'packages' where we put a bunch of discrete cores and management chips into an enclosure that drops into the processor slot.
Setting aside computing with photons (we seem to be making more progress with quantum computing), when are we going to see optical communication hardware shrink to the point where we can get more bandwidth between these chips than copper interconnects?
Or does copper have a head start in signal processing that IC optics can't surmount?
The essential problem is that photonics sounds great in principle, but once you account for all the energy in an actual implementation, you end up behind electronics on bandwidth and above it on power, until you reach a certain critical distance beyond which light wins.
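Back-of-the-envelope, the trade-off looks like this: copper costs roughly per unit length, while optics pays a mostly fixed conversion cost at each end. All figures in the sketch below are made-up placeholders to show the shape of the crossover, not measured numbers.

```python
# Toy model: copper scales with distance, optics pays a fixed E/O + O/E cost.
# Every constant here is an assumed placeholder, not a measured value.
COPPER_PJ_PER_BIT_PER_CM = 1.0    # assumed electrical link cost
OPTICAL_FIXED_PJ_PER_BIT = 5.0    # assumed laser/modulator/detector overhead
OPTICAL_PJ_PER_BIT_PER_CM = 0.01  # assumed waveguide propagation cost

def energy_per_bit(distance_cm: float) -> tuple[float, float]:
    """Return (copper, optical) energy per bit in pJ at a given distance."""
    copper = COPPER_PJ_PER_BIT_PER_CM * distance_cm
    optical = OPTICAL_FIXED_PJ_PER_BIT + OPTICAL_PJ_PER_BIT_PER_CM * distance_cm
    return copper, optical

for d in (1, 2, 5, 10, 20, 100):
    cu, opt = energy_per_bit(d)
    winner = "copper" if cu < opt else "optical"
    print(f"{d:4d} cm: copper {cu:7.2f} pJ/bit, optical {opt:6.2f} pJ/bit -> {winner}")
```

With these placeholder numbers the crossover sits around 5cm; shrink the conversion overhead and the critical distance moves down toward chip-to-chip territory.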
The problem with ever-smaller transistors isn't actually a lack of ideas. Optical is promising! 2.5D FinFET is already in production. Gate-all-around transistors are past the research stage.
All of these new technologies are a drastic change from what semiconductor fabs are used to manufacturing. That means new tools, new training, new testing processes, and an unproven economic model.
The problem with having too many choices is that companies must choose, and invest heavily in one option. If that fails to scale up to industrial capacity, it could ruin the company.
I like optical because it's resistant to EMP, but that might just be my taste in films like The Matrix.
Speaking of things I wish were real, whatever happened to the DNA-based computer I was promised 30 years ago? Or the memristors that would allow instantly booting back to the state my computer was in when I turned it off last night?
DNA computers? I think the answer is “we outclassed them”. A base pair is only ~0.34nm, so a 10nm transistor is about the size of 30 base-pairs at this point, but much faster and much less error-prone.
I think DNA has semiconductors beat on raw information density though. A tube of DNA can contain a datacenter’s worth of information. Read times are abysmal and the error rate is through the roof, but it’s relatively stable on the order of hundreds of years.
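The napkin math on density checks out, ignoring the redundancy and packaging a real archive would need:

```python
# Raw density of double-stranded DNA: 2 bits per base pair, ~650 g/mol per pair.
AVOGADRO = 6.022e23
BP_MOLAR_MASS_G = 650           # average molar mass of one base pair, g/mol
BITS_PER_BP = 2                 # four bases -> 2 bits per base pair

grams_per_bp = BP_MOLAR_MASS_G / AVOGADRO     # ~1.1e-21 g per base pair
bits_per_gram = BITS_PER_BP / grams_per_bp    # ~1.9e21 bits per gram
print(f"~{bits_per_gram / 8 / 1e18:.0f} EB per gram")  # roughly 230 EB/g
```

A couple of hundred exabytes per gram really is a datacenter in a test tube, before error-correction overhead eats into it.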
Just isn’t something in super high demand. Stability and density are probably the only major advantages, right? And I don’t imagine the equipment to "read" that data will be very compact in the foreseeable future.
If the demand was high enough I bet there would already be profitable companies pursuing it. I do see it as a potential market though. It’s no fun having a cold storage archive that needs constant hardware replacements.
I would imagine reading would be done in a similar way as it is today: amplification and sequencing. The data would have to carry a hefty amount of error correction. There are no apparent show-stoppers that I’m aware of, just not enough interest.
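As a toy sketch of the encoding side (2 bits per base, with a single parity base standing in for the much heavier error correction a real scheme would use):

```python
# Toy DNA codec: pack each byte into 4 bases plus 1 parity base.
# Real systems use far stronger codes; this only illustrates the idea.
BASES = "ACGT"  # 00->A, 01->C, 10->G, 11->T

def encode(data: bytes) -> str:
    strand = []
    for byte in data:
        for shift in (6, 4, 2, 0):            # two bits at a time, high bits first
            strand.append(BASES[(byte >> shift) & 0b11])
        strand.append(BASES[byte % 4])        # toy parity base
    return "".join(strand)

def decode(strand: str) -> bytes:
    data = bytearray()
    for i in range(0, len(strand), 5):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        assert strand[i + 4] == BASES[byte % 4], "parity mismatch: re-read block"
        data.append(byte)
    return bytes(data)

assert decode(encode(b"hello")) == b"hello"
```

In practice you'd also have to avoid long homopolymer runs and extreme GC content, which sequencers handle badly, so real encodings are messier than this.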
We're still not even close on energy efficiency, though. I can't remember the exact figures, but for most tasks the brain is much more efficient per joule than any CPU.
This is true for things people have evolved to do, but a cell phone can do a lifetime's worth of arithmetic in less than a second, using negligible energy relative to 2,000 calories per day for 70 years.
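Rough numbers, with deliberately generous assumptions for the human:

```python
# A lifetime of human arithmetic vs. a phone. All rates are assumptions.
human_ops_per_sec = 1                       # one sum per second, nonstop, no sleep
lifetime_s = 70 * 365 * 86_400              # ~2.2e9 seconds in 70 years
lifetime_ops = human_ops_per_sec * lifetime_s

phone_ops_per_sec = 1e10                    # rough figure for a multi-core phone SoC
phone_time_s = lifetime_ops / phone_ops_per_sec
print(f"phone time: {phone_time_s:.2f} s")  # ~0.22 s

food_joules = 2000 * 4184 * 365 * 70        # 2000 kcal/day for 70 years, ~2.1e11 J
phone_joules = 2 * phone_time_s             # assume ~2 W draw while computing
print(f"energy ratio: {food_joules / phone_joules:.1e}")  # ~5e11
```

Even if every assumption here is off by an order of magnitude, the phone still wins by a comical margin.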
True, though I don’t think that’s a “DNA” computer in the same way that merely using semiconductors doesn’t make my laptop a “quantum” computer.
Going with the Kurzweil estimate on Wikipedia, human brains are about 1e15 ops/Joule, whereas the best on the Green500 list (June 2019) is 1.76e10 ops/Joule.
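Spelling out the gap:

```python
brain_ops_per_joule = 1e15        # the Kurzweil-style estimate cited above
green500_ops_per_joule = 1.76e10  # top of the June 2019 Green500 list
print(f"~{brain_ops_per_joule / green500_ops_per_joule:.1e}x")  # ~5.7e4x
```

So by those figures the brain leads by nearly five orders of magnitude.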
> Or the memristors that would allow instantly booting back to the state my computer was in when I turned it off last night?
I would agree with you, but with the current ability to suspend, I don't see the point. I always just suspend when I would otherwise power-off. The only times I power-off now are when I don't use my laptop for many days and the battery just completely drains itself, when I want to load an updated kernel, or when the power goes out in the neighborhood. They're all pretty rare events.
If you're talking about using DNA to solve NP-hard problems, the reason is that it just shifted the problem to DNA quantity: while the computation was theoretically constant-time, the DNA required grows exponentially with problem size. And if you settle for suboptimal answers, you may as well use an Ising computer.
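To see how quickly the test tube overflows, here's a rough estimate with an assumed per-strand mass (the exact figure doesn't change the conclusion):

```python
# Adleman-style brute force encodes every candidate path as a DNA strand,
# so the required mass grows with the search space. Figures are illustrative.
from math import factorial

GRAMS_PER_STRAND = 1e-19   # assumed mass of one encoded-path molecule

for n_cities in (10, 20, 30):
    candidate_paths = factorial(n_cities)   # Hamiltonian-path-style search space
    print(f"{n_cities} cities: ~{candidate_paths * GRAMS_PER_STRAND:.1e} g of DNA")
```

At 20 cities you need a fraction of a gram; at 30 you need tens of millions of tonnes. That's the exponential hiding inside the "constant-time" claim.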
If you're talking about DNA storage, then cost, durability, speed, energy, reliability, size, etc. are all issues.
I think, and don't quote me here, Optane is still an order of magnitude or more away from RAM in terms of read latency (and therefore IOPS).
It's massively better than traditional SSDs, but RAM is still waaayyyyy better.
So, Optane could be useful as a boot drive and fast-ish swap space, but that's probably it. In most applications Optane won't feel that different from your usual NVMe disks. I think. I can't afford one, haha
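For rough scale, these are the ballpark read latencies as I understand them; exact numbers vary a lot by part and workload, so treat them as order-of-magnitude only:

```python
# Approximate read latencies in microseconds (assumed ballpark figures).
latency_us = {
    "DRAM":          0.1,
    "Optane DIMM":   0.35,   # persistent-memory mode
    "Optane SSD":    10.0,
    "NAND NVMe SSD": 80.0,
}
dram = latency_us["DRAM"]
for device, us in latency_us.items():
    print(f"{device:14s} ~{us:6.2f} us  ({us / dram:5.0f}x DRAM)")
```

So the Optane SSDs most of us could actually buy sit about two orders of magnitude off DRAM, while the DIMM form factor gets within one.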
Yes, there's probably a limit to how small you can make a transistor. But there are far more fundamental limits to computation, such as Landauer's principle, which sets a floor on how much energy an irreversible boolean operation must dissipate:
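That floor is easy to compute:

```python
# Landauer's principle: erasing one bit at temperature T dissipates >= kT*ln(2).
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

e_min = k_B * T * log(2)
print(f"minimum energy per bit erased: {e_min:.2e} J")  # ~2.9e-21 J

# Compared against an assumed ~1e-17 J per switching event in today's logic:
print(f"remaining headroom: ~{1e-17 / e_min:.0f}x")
```

In other words, even after transistor scaling stalls, there are several orders of magnitude of theoretical headroom left in energy per operation.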