Lots of progress in this field lately. Google, IBM, and Intel all say they've built experimental quantum processors with 50-70 qubits, and a startup called Rigetti plans a 128-qubit QC. And Microsoft seems to be betting big on exotic physics (Majorana fermions) that could, in theory, yield far more stable qubits.
With all this investment, quantum computing may be coming soon.

ETA: links

https://www.technologyreview.com/s/609451/ibm-raises-the-bar...
http://ai.googleblog.com/2018/03/a-preview-of-bristlecone-go...
https://newsroom.intel.com/press-kits/quantum-computing/#49-...
https://medium.com/rigetti/the-rigetti-128-qubit-chip-and-wh...
> With all this investment, quantum computing may be coming soon.
Assuming we double the number of physical qubits we can use in a quantum computer every year starting now, we might be able to build one capable of running interesting algorithms sometime around 2040.
The general consensus in the field is that we need roughly 100,000 times more physical qubits in a machine to achieve scalable, error-corrected quantum computation. That represents millions to tens of millions of qubits; the current state of the art sits somewhere in the tens of qubits. That's quite sobering compared to the breathless tech reporting, and it doesn't even address the reductions in decoherence we'll need.
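Back-of-the-envelope, the doubling assumption checks out (a sketch in Python; the ~50-qubit starting point, the 2018 start year, and the ~5M target are all assumed round numbers, not official figures):

    import math

    # Assumed round numbers: ~50 physical qubits today (2018),
    # ~100,000x more needed, i.e. on the order of 5 million.
    start_qubits = 50
    target_qubits = 5_000_000

    doublings = math.ceil(math.log2(target_qubits / start_qubits))
    print(doublings, "doublings ->", 2018 + doublings)  # 17 doublings -> 2035

Seventeen consecutive yearly doublings lands you in the mid-2030s; allow for any slippage at all and you're at 2040.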
Quantum computing is not coming soon, barring an unlikely and unpredictable series of fundamental breakthroughs. The field is still very much in the "we have no evidence it's impossible" stage of research.
It sounds like the error correction we need for "general-purpose" quantum computation requires a lot of qubits. However, it also sounds like we can build some special-purpose machines using the 50-100 qubit devices that are coming online now. https://www.youtube.com/watch?time_continue=1&v=BvVciA5iXH4 (Mentions work by Scott Aaronson that could be usable with ~50 qubits: specifically, generating random numbers that we can trust.)
Scott Aaronson is a brilliant computer scientist, but I honestly don't see it. There are a lot of these bizarre "applications" coming from quantum computing theorists or physicists. Generally speaking, they read as a little out of touch with whatever field they're proposing the new thing to.
For example, generating random numbers with quantum computers is an interesting theoretical exercise. But it's completely asinine in practice. The world moved on from information-theoretic randomness decades ago because complexity-theoretic randomness is basically fine for just about any purpose.
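Case in point: complexity-theoretic randomness is already a solved, shipped problem on every mainstream platform. A minimal sketch using Python's standard-library secrets module (a CSPRNG seeded from the OS entropy pool):

    import secrets

    # Cryptographically strong randomness, no quantum hardware required.
    key = secrets.token_bytes(32)   # 256 random bits, suitable as a key
    nonce = secrets.token_hex(16)   # 16 random bytes, hex-encoded
    print(key.hex(), nonce)

To any computationally bounded adversary, that output is indistinguishable from "true" randomness, which is the whole point.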
It comes across as grasping at straws when you’re proposing using a quantum computer to generate true randomness. Sure, you could, but what a ridiculously overengineered way of getting random bits... For similar reasons, I’m incredibly skeptical of quantum cryptography. Go find someone who publishes at an IACR conference and ask them for their take on quantum key agreement protocols.
IBM's scientists say that quantum computing won't be here for another 20 years. IBM's latest PR campaigns say 5-10 years. Who do you believe?
Google promised quantum supremacy um... well... last year. Made for some great headlines, with the press breathlessly announcing that the quantum era has arrived. Only, it didn't. Internally, there are two quantum computing groups who are competing for resources and don't seem to collaborate.
Rigetti plans a 128-qubit system, but has yet to calibrate it, so I'm not holding my breath. Their qubits still have astonishingly low connectivity, and their gate fidelities are so low that a larger qubit count won't get them anywhere. In other news, they laid off half of their software team because their chip isn't actually useful, so a full-fledged software stack was somewhat premature (if quite elegant). Ex-employees have some great opinions about the culture; check Glassdoor.
Microsoft's QC efforts are fascinating, but AFAIK they don't even have a plan to build a computer. Word on the street is that huge egos and a lack of production experience are conspiring to prevent progress.
And the funny thing about these quantum computing efforts: nobody even has a plan for error correction. For all that Scott Aaronson rails against D-Wave... he doesn't dig deep into gate-model efforts to expose their flaws. And strangely, he's still focused on the coherence time of D-Wave's qubits.
There's a fundamental difference between gate-model and adiabatic qubits. In a gate-model qubit, "0" is the ground state and "1" is an excited state; in D-Wave's qubits, "+1" corresponds to clockwise current and "-1" to counterclockwise current, and both are ground states. Coherence time is roughly the amount of time it takes an excited state to decay into a ground state. Adiabatic transitions begin, remain, and end in ground states, barring external perturbations -- and if such a perturbation occurs, a low coherence time is actually preferable. Higher coherence times do still benefit adiabatic quantum computers like D-Wave's and Google's G-mons, since they boost long-range interactions between qubits, but coherence isn't the showstopper that Aaronson hypes it to be.
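To make "coherence time" concrete, here's a toy exponential-decay model (the T1 and gate-time values are made up for illustration, not anyone's measured numbers):

    import math

    def survival_probability(t_us: float, t1_us: float) -> float:
        """P(excited state has not yet decayed) = exp(-t / T1)."""
        return math.exp(-t_us / t1_us)

    # Illustrative only: a 100 us T1 and a 1 us gate sequence.
    print(survival_probability(t_us=1.0, t1_us=100.0))  # ~0.990

For a gate-model qubit, that decay erases your "1"; for an adiabatic qubit sitting in a ground state, there's no excited state to lose in the first place.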
So gate-model quantum computing is promising, in theory. But there are some very real, very difficult roadblocks to clear before we get there. And for all that D-Wave gets omitted from such discussions... well, the theorists say that adiabatic and gate-model quantum computing are actually equivalent [1]. Wake me up when somebody else can factor arbitrary 10-bit integers [2] -- and even so... 10-bit factoring problems don't really scream "quantum computing may be here soon"; more like "1970 called, and says quantum computing is crushing it."
I thought Gil Kalai had a very reasoned opinion on this, considering the need for error correction; his estimate was roughly ~100 noisy qubits per "true" qubit. Current progress is nowhere near the required thousands of coherent qubits, nor has it proven effective for "noisy" problems like chemical simulations.
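The overhead arithmetic under those figures, as a sketch (the per-logical ratio is the ~100 mentioned above; the logical-qubit counts are just representative values for "thousands"):

    # Illustrative resource arithmetic, not a real estimate.
    physical_per_logical = 100           # ~100 noisy qubits per "true" qubit
    for logical in (1_000, 5_000, 10_000):
        print(logical, "logical ->", logical * physical_per_logical, "physical")

So even the optimistic end of that range wants ~100,000 physical qubits, against today's tens.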
"if you’re doing superconducting qubits, which is maybe the most popular approach today, then at a bare minimum, you need to cool everything down to 10 mKB,"
Anyone know what that temperature is supposed to be? Googling for "mkb" doesn't turn up anything useful. mK for milli-Kelvin doesn't make much sense, as you don't need to get that low for superconductivity in general, and that's a pretty hard temperature to reach.
10 millikelvin is the temperature range that allows the use of commercial RF equipment in the ~5 GHz range to control the qubits.
The relationship between the two is essentially E = k_B * T = h * f, where k_B is the Boltzmann constant and h is Planck's constant.
If the temperature were higher than 10 millikelvin, thermal fluctuations could have energy comparable to your control signals and thus cause unintended state transitions.
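Plugging in the constants (a quick sanity check; the 5 GHz control frequency and 10 mK stage temperature are the numbers from above):

    # Compare thermal energy k_B*T at 10 mK with photon energy h*f at 5 GHz.
    k_B = 1.380649e-23    # Boltzmann constant, J/K
    h = 6.62607015e-34    # Planck constant, J*s

    thermal = k_B * 0.010    # ~1.4e-25 J
    photon = h * 5e9         # ~3.3e-24 J
    print(photon / thermal)  # ~24

Equivalently, 5 GHz corresponds to h*f/k_B of about 0.24 K, so at 10 mK the control photons carry ~24x the thermal energy scale and thermal occupation of the qubit transition is negligible.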
It's not just superconductivity that you need. You also need to reduce thermal noise to maintain coherence between qubits. Millikelvin is the typical temperature range D-Wave computers go down to [1], for instance.
In fact, when you see large mainframe-like pictures of quantum computers, the big box mostly contains cooling equipment. The actual circuitry is pretty small.
Well, there's a pretty cool DJ with the username "mkb" across a few different sites. Aaronson is correctly noting that any superconducting quantum computer needs to be at least ten times as cool as that guy in order to work.
Wow! This is the best explainer of quantum computing I have ever read. Scott does a great job at capturing the essential challenges and potential applications, without embellishing or being overly pessimistic. Good stuff.
One of the world's smartest folks. In one of the world's most interesting fields. Is kind enough to take time away from his research, and his family, to share with us dummies.
Not a world-class deliverer of speeches. Seems like a forgivable flaw to me.
No idea about talks, but I've had two chances to participate in a living room group discussion with him, and I remember both fondly. He is one of the smartest people I've met, and on top of his field.