
I’m a quantum dabbler so I’ll throw out an armchair reaction: this is a significant announcement.

My memory is that 256-bit keys in non-quantum-resistant algorithms need something like 2,500 qubits or so, and by that I mean generally useful, programmable qubits. Showing a bit over 100 qubits with stability (meaning the information survives long enough to be read) that are general enough to run some benchmarks on is something many people thought might never come.

There's a sort of religious reaction people have to quantum computing: it breaks so many things that I think a lot of people just like to assume it won't happen. Too much in computing and data security would change, so let's not worry about it.

Combined with the slow pace of physical research progress (Shor's algorithm for quantum factoring dates to the mid '90s) and the companies selling snake oil, it's easy to ignore.

Anyway seems like the clock might be ticking; AI and data security will be unalterably different if so. Worth spending a little time doing some long tail strategizing I’d say.




You need to distinguish between "physical qubits" and "logical qubits." This paper creates a single "first-of-a-kind" logical qubit with about 100 physical qubits (using Surface Code quantum error correction). A paper from Google in 2019 estimates needing ~20 million physical qubits ("How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits" - https://arxiv.org/abs/1905.09749), though recent advances probably brought this number down a bit. That's because to run Shor's algorithm at a useful scale, you need a few thousand very high quality logical qubits.
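To get rough intuition for the gap (my own back-of-envelope with assumed numbers, not the paper's accounting):

    # Assumptions for illustration: a distance-d surface code needs roughly 2*d**2
    # physical qubits per logical qubit, and Shor at RSA-2048 scale needs a few
    # thousand logical qubits (plus more for routing and magic-state factories).
    logical_qubits = 6000
    code_distance = 27
    physical_per_logical = 2 * code_distance ** 2    # ~1,458
    print(logical_qubits * physical_per_logical)     # ~8.7 million -- same order as the ~20M estimate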

So despite this significant progress, it's probably still a while until RSA is put out of a job. That being said, quantum computers would be able to retroactively break any public keys that were stored, so there's a case to be made for switching to quantum-resistant cryptography (like lattice-based cryptography) sooner rather than later.


Thank you for the explanation. It's still an upwards update on the qubit timelines of https://arxiv.org/pdf/2009.05045 (see Fig. 7), but not by an insane amount. We've realized their 95% expectation of qubit progress (1 logical qubit) for 2026, in 2024.92 instead.

Which to be clear is quite a bit faster than expected in 2020, but still within the realm of plausible stuff.


Nice, thanks for linking that paper. I also did below.

The authors argue (e.g. in the first comment here https://scottaaronson.blog/?p=8310#comments) that by their definition, Google still only has a fraction of one logical qubit. Their logical error rate is of order 1e-3, whereas this paper considers a logical qubit to have error of order 1e-18. Google's breakthrough here is to show that the logical error rate can be reduced exponentially as they make the system larger, but there is still a lot of scaling work to do to reach 1e-18.

So according to this paper, we are still on roughly the same track that they laid out, and therefore might expect to break RSA between 2040 and 2060. Note that there are likely a lot of interesting things one can do before breaking RSA, which is among the hardest quantum algorithms to run.
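To make "a lot of scaling work to do" a bit more concrete, here's a naive extrapolation in Python. It assumes the error-suppression factor per two steps of code distance stays fixed at roughly the Λ Google reports; in reality Λ itself should improve as physical error rates drop, so treat this as illustration only:

    lam = 2.14       # assumed: suppression factor per +2 of code distance (paper reports Lambda just over 2)
    eps_d7 = 1e-3    # rough logical error rate at distance 7, as discussed above
    for d in range(7, 29, 4):
        print(d, f"{eps_d7 / lam ** ((d - 7) / 2):.1e}")
    # 7: 1.0e-03, 11: 2.2e-04, 15: 4.8e-05, 19: 1.0e-05, 23: 2.3e-06, 27: 4.9e-07
    # Even at distance 27 this only reaches ~5e-7 -- nowhere near the ~1e-18 figure above.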


How should I understand the claim that the logical error rate can be reduced exponentially as the system becomes larger? Does it mean each physical qubit will generate more logical qubits as the total number of physical qubits increases?


A simple classical example where this is true is a repetition code. If you represent 1 as 11...1 and 0 as 00...0, randomly flip some bits, and then recover the logical state with majority voting, the probability of a logical error occurring shrinks exponentially as you add more bits to it. This is because a logical error requires flipping at least half the bits, and the probability of that happening decreases exponentially.
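A minimal sketch of that majority-vote intuition in Python, if it helps (the 10% flip rate is just an example):

    import random

    def logical_error_rate(n_bits, p_flip, trials=100_000):
        """Estimate how often majority voting decodes to the wrong logical bit."""
        errors = 0
        for _ in range(trials):
            flips = sum(random.random() < p_flip for _ in range(n_bits))
            if flips > n_bits // 2:      # majority of bits flipped -> logical error
                errors += 1
        return errors / trials

    for n in (1, 3, 5, 7, 9):
        print(n, logical_error_rate(n, p_flip=0.1))
    # Roughly 1e-1, 3e-2, 9e-3, 3e-3, 9e-4: the logical error rate shrinks
    # exponentially as bits are added, even though each bit is equally noisy.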

The error correcting code used in this work also has a nice intuitive explanation. Imagine a 2D grid of bits, visualized as pixels that can be black or white. Now imagine drawing a bunch of lines of black pixels, and enforcing that lines can only end on the top or bottom boundary (loops are allowed). If there is an even number of lines connecting the top to the bottom, we call that logical 0, and if there is an odd number of lines, we call that logical 1. This again has the property that as you add more bits, the probability of changing between logical 1 and 0 gets exponentially smaller, because the lines connecting top to bottom get longer (just like in the repetition code).

This code also has the nice property that if you measure the value of a small patch of (qu)bits, there's no way to tell what the logical state is. This is important for quantum error correction, because measurement destroys quantum information. So the fact that local measurements don't reveal the logical state means that the logical state is protected. This isn't true for the repetition code, where measuring a single bit tells you the logical state.


> so there's a case to be made for switching to quantum-resistant cryptography (like lattice-based cryptography) sooner rather than later.

This.

People seem to think that because something is end-to-end encrypted it is secure. They don't seem to grasp that the traffic and communication possibly dumped/recorded now in encrypted form could be used against them decades later.


Well, yes, but currently there are no well-tested (i.e. recommended by the ITsec community) post-quantum cryptosystems, as far as I understand.

https://crypto.stackexchange.com/a/61596

But ... AES is believed to be quantum-safe-ish, so with perfect forward secrecy this exact threat can be managed quite well.

The best currently known quantum attack on AES (Grover's algorithm) requires a serial computation of 2^(key length / 2) steps, so if the key is 128 bits long it requires 2^64 sequential steps.

https://www.reddit.com/r/AskNetsec/comments/15i0nzp/aes256_i...
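Back-of-envelope for the numbers above (plain arithmetic, nothing quantum):

    # A k-bit key takes ~2**k classical guesses but only ~2**(k/2) Grover iterations,
    # and those Grover iterations have to run sequentially on the quantum machine.
    for k in (128, 256):
        print(f"AES-{k}: classical ~2^{k}, Grover ~2^{k // 2}")
    # AES-128 drops to ~2^64 sequential quantum steps; AES-256 still leaves ~2^128,
    # which is why AES-256 is usually called quantum-safe-ish.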


Google uses NTRU-HRSS internally, which seems reasonable.

https://cloud.google.com/blog/products/identity-security/why...


Signal and Apple both use post-quantum.


I read about Signal's double-trouble tactics, but I haven't heard about Apple's.

Ah, okay for iMessage, something called PQ3[1], hm, it uses Kyber. And it's also a hybrid scheme, combining ECC. And a lot of peer review.

And there's also some formal verification for Signal's PQXDH [2].

Oh, wow, not bad. Thanks!

Now let's hope a good reliable sane implementation emerges so others can also try this scheme. (And I'm very curious of the added complexity/maintenance burden and computational costs. Though I guess this mostly runs on the end users' devices, right?)

[1] https://security.apple.com/blog/imessage-pq3/ [2] https://github.com/Inria-Prosecco/pqxdh-analysis


This is correct. I worked in quantum research a little.


any books for beginners that you recommend?



This is an outstanding resource, thanks!


Dear rhubarbtree,

I'm very sorry to admit it but I'm too lazy to read the entire discussion in this thread. Could you please tell me, a mere mortal, at which point the humanity should start worrying about the security of asymmetric encryption in the brave new world of quantum computing?


Funny turn of phrase, but you probably need 20 million physical qubits (or at least you did a few years ago, this number may have dropped somewhat) to do anything significant. I don't think we can be confident we'll get there any time soon.

> quantum computers would be able to retroactively break any public keys that were stored

Use a key exchange that offers perfect forward secrecy (e.g. Diffie-Hellman) and you don't need to worry about your RSA private key eventually being discovered.


Diffie-Hellman isn't considered to be post-quantum safe: https://en.wikipedia.org/wiki/Shor%27s_algorithm#Feasibility...


> Forward secrecy is designed to prevent the compromise of a long-term secret key from affecting the confidentiality of past conversations. However, forward secrecy cannot defend against a successful cryptanalysis of the underlying ciphers being used, since a cryptanalysis consists of finding a way to decrypt an encrypted message without the key, and forward secrecy only protects keys, not the ciphers themselves.[8] A patient attacker can capture a conversation whose confidentiality is protected through the use of public-key cryptography and wait until the underlying cipher is broken (e.g. large quantum computers could be created which allow the discrete logarithm problem to be computed quickly). This would allow the recovery of old plaintexts even in a system employing forward secrecy.

https://en.wikipedia.org/wiki/Forward_secrecy#Attacks


I’m talking specifically about RSA being eventually broken. If just RSA is broken and you were using ECDHE for symmetric keying, then you’re fine.

The point is that you can build stuff on top of RSA today even if you expect it to be broken eventually if RSA is only for identity verification.


The relevant RSA break is sufficiently powerful quantum computers, which also break ECDH (actually, ECDH is easier than classically equivalent-strength RSA for quantum computers[1]), so no, you’re not fine.

[1] https://security.stackexchange.com/questions/33069/why-is-ec...


I would actually expect RSA to see a resurgence due to this, especially because you can technically scale RSA to very large key sizes, potentially pushing the crack date decades later than any ECC construction, with the potential that such a large quantum computer may never even arrive.

There are several choices for scaling RSA, too: you can push the prime size, which slows key generation considerably, or, the more reasonable approach, settle on a prime size but use multiple primes (MP-RSA). The second approach scales indefinitely, though it would only serve a purpose if you are determined to hedge against the accepted PQC algorithms (Kyber/ML-KEM, McEliece) being broken at some point.


If you don't mind a one terabyte public key. https://eprint.iacr.org/2017/351.pdf


also that paper (IMO) is ridiculously conservative. Just using 1GB keys is plenty sufficient, since it would require a quantum computer with billions of qubits to decrypt.


How long does it take to generate a key that big? What probabilities do you need to put on generating a composite number and not a prime? Does the prime need extra properties?


Based on https://eprint.iacr.org/2017/351.pdf it would be about 1900 core hours (but I'm pretty sure optimized implementations could bring it down a bunch). No extra properties needed and moderate probability is sufficient.


Although I know it’s an apocryphal quote, I am reminded of “640K should be enough for anybody.”

The Intel 4004, in 1971, had only 2,250 transistors.

A handful of qubits today might become a billion sooner than you think.


It took until 2011 before Sandy Bridge cracked 2 billion transistors. If we get 40 years of quantum resistance from 1GB RSA, that would be pretty great.


Perfect forward secrecy doesn't work that well when the NSA's motto is "store everything now, decrypt later." If they intercept the ephemeral key exchange now they can decrypt the message 10 or 50 years later.


Diffie-Hellman never sends the key over the wire; that's the point. There is nothing to decrypt in the packets that tells you the key both sides derived.

Unless they break ECDHE, it doesn’t matter if RSA gets popped.
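A toy sketch of why (tiny, insecure numbers; real (EC)DHE uses enormous groups, but the shape is the same):

    p, g = 23, 5                    # public parameters, visible on the wire
    a, b = 6, 15                    # private keys, never transmitted
    A = pow(g, a, p)                # Alice sends A = 8 over the wire
    B = pow(g, b, p)                # Bob sends B = 19 over the wire
    assert pow(B, a, p) == pow(A, b, p) == 2   # both derive the same secret; it never crosses the wire
    # The caveat from the rest of the thread: Shor's algorithm can recover a or b
    # from the recorded public values, so classic (EC)DH is not post-quantum safe either.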


Diffie-Hellman, to the best of my understanding, also relies on the same kind of hard problems that make public-key cryptography possible. If you trivialize factoring of big numbers (and the closely related discrete logarithm problem), you break both RSA and the original DHE. Not sure how it will work for elliptic curves, but my instinct tells me that if you make the fundamental ECC problem easy, the exchange will also go down.


According to the top image on the Wikipedia page, Diffie Hellman does send the public key over the wire.

https://en.wikipedia.org/wiki/Diffie%E2%80%93Hellman_key_exc...


wouldn't be surprised if ecdhe isn't quantum resistant.


Something tells me that by the end of the century only the one-time pads will still be holding their secrets.


Even for that to work you need a good random number generator


That's pretty trivial. xor a video camera with AES and no one is decrypting that ever.


And, famously, the camera is pointing at a lava lamp, ha ha.


Honestly not sure how they created one-time pads 100 years ago.


In one case it was people just banging on typewriters randomly.


Forward secrecy is orthogonal to post-quantum safety.


Perfect forward secrecy requires the exchange of ephemeral keys. If you use either ECC or RSA for this and the traffic is captured a quantum computer will break it.

All perfect forward secrecy means is that you delete your own ephemeral private keys, the public keys stay in the record. And a quantum computer will recover the deleted private keys.

Also, none of the currently accepted post-quantum cryptographic algorithms offer a Diffie-Hellman construction. They use KEM (Key Encapsulation Mechanism).


Not exactly, they can just reverse the entire chain.


What chain are you talking about?


The required number of qubits to execute Shor’s algorithm is way larger than 2500 qubits as the error ceiling for logical qubits must decrease exponentially with every logical qubit added to produce meaningful results. Hence, repeated applications of error correction or an increase in the surface code would be required. That would significantly blow up the number of physical qubits needed.


He’s quoting the number of logical qubits (which is 1024 IIRC, not 2500), after error correction.

ETA: Wikipedia says 2330 qubits, but I'm not sure it is citing the most recent work: https://en.wikipedia.org/wiki/Elliptic-curve_cryptography#ci...


I actually thought the number of logical qubits needed was around 20 for factorisation, since the state space size is 2^(2^n), and hence did not recognise those figures as counts of logical qubits. It is a common misunderstanding that error correction needs to be done only once, as with classical computers, in which case the numbers would fit together with one pass of error correction.

Shor's algorithm requires binary encoding; hence, 2048 logical qubits are needed before it becomes a nuisance for cryptography. This, in turn, means that one will always be able to run away from a quantum adversary by paying a polynomial price on group-element computations, whereas a classical adversary is exponentially bounded in computation time and a quantum adversary is exponentially bounded in the number of physical qubits. Fascinating...


1024 is for RSA-1024, which is believed to be broken by classical means at this point. Everyone doing anything with RSA is on 4k or larger.


2048. There is no plausible conventional attack on 2048; whatever breaks 2048 is probably going to break 4096, as I understand it.

https://crypto.stackexchange.com/questions/1978/how-big-an-r...


> Everyone doing anything with RSA is on 4k or larger.

The Let's Encrypt intermediate certificates R10 and R11 seem to be only 2048 bit.


A signature (not encryption) with a short validity period is still fine at 2048.


They are? The short term recommendation is 3072, and I still see lots of 2048. Actually, it's mostly 2048.


Reminder: if your DKIM keys haven't been rotated in a while, they might still be 1024-bit. E.g. Google Workspace, though new keys are 2048 now.


I took this conversation to be about ECC, not RSA.


My completely unfounded tin-foil-hat theory at the moment is that ECC was pushed as a standard not because it was faster/smaller, but because the smaller bit size makes it less quantum-resistant and more prone to be broken first (if not already) via quantum supremacy.


ECC is easier to implement right. Even the older ones that are full of footguns have orders of magnitude fewer footguns than RSA.

So don't fear them due to unfounded theories.


Is there consensus on what signature validation is correct? https://hdevalence.ca/blog/2020-10-04-its-25519am


IMO, that article has a clear conclusion that you should aim your software at the libsodium-1.0.16 pattern (no edge cases).

The problem it's presenting is more about software on the wild having different behavior... And if "some people connect on the internet and use software that behaves differently from mine" is a showstopper for you, I have some really bad news.


An interesting read. So the general idea, as I understand it, is that one can make signatures that pass or do not pass validation depending on the implementation, while all implementations do protect against forgery.

In my opinion the correct approach here is the most liberal one for the Q, R points: check each point's cofactor at parsing (8P = 0) and then use the unbatched equation for verification. This way implementations can be made group-agnostic.

Having group-agnostic implementations is important, as it creates a proper separation of concerns between curve implementations and the code that uses them. For instance, if we were to accept strict validation as the ground truth and best practice, one would have an enormously hard time specifying verifiers for zero-knowledge proofs and would also double the time and code for implementers without any effect on soundness.


You're right, that's pretty unfounded.


Discrete log isn't broken by the factoring version of Shor's algorithm (it needs the discrete-log variant), and appears to need more qubits to break (per key bit).


Is it broken? It seems no one has solved the RSA-1024 challenge yet.


The state actors aren't going to reveal their capabilities by solving an open challenge.


The most recent result on reducing the number of logical qubits is [1]. They show how to use residue arithmetic to factor n bit numbers using n/2 + o(n) logical qubits (they give the example of 1730 qubits to factor a 2048 bit number).

[1]: https://eprint.iacr.org/2024/222


Well, this is embarrassing. I just realised I had wrongly interpreted the result in [1]. I made an error on how Shor's algorithm encodes the numbers, wrongly assuming that numbers are encoded into the quantum state space, which is of size 2^(2^n), whereas instead one bit is encoded into one qubit, which is also more practical.

The result should be interpreted directly, with the required error rate for logical qubits decreasing as ~n^(-1/3). This, in turn, means that factorisation of a 10000-bit number would only require a logical error rate 1/10th of that needed for a 10-bit number. This is practical, given that one can make a quantum computer with around 100k qubits and correct errors on them.

On the other hand, a sibling comment already mentioned the limited connectivity that those quantum computers now have. This, in turn, requires repeated application of SWAP gates to get the interactions one needs. I guess this would add a linear overhead for the noise; hence, the scaling of the required error rate for logical qubits is around ~n^(-4/3). This, in turn, makes 10000-bit factorisation require a logical error rate 1/10000 that of 10-bit factorisation. Assuming that 10 physical qubits are needed to reduce the error by an order of magnitude, that works out to around 400k physical qubits.

[1]: https://link.springer.com/article/10.1007/s11432-023-3961-3


Isn't that what they are claiming is true now? That the errors do decrease exponentially with each qubit added?


What they claim is that adding physical qubits reduces the error rate of the logical qubits exponentially. For Shor's algorithm, the error rate of the logical qubits must decrease exponentially with every single logical qubit added for the system to produce meaningful results.

To see how it plays out, consider adding a single logical qubit. First you need to increase the number of physical qubits to accommodate the new logical qubit at the same error rate. Then you multiply the number of physical qubits to accommodate the exponentially decreased error rate. That multiplier is a constant factor N (or polynomial, but let's keep things simple) by which the number of physical qubits needs to be multiplied to produce a system with one additional logical qubit at an error rate that still produces meaningful results.

To attain 1024 logical qubits for Shor's algorithm, one would need N^1024 physical qubits. The case where N < 1 would only be possible if the error decreased by itself without additional error correction.


The error rates given are still horrendous and nowhere near low enough for the Quantum Fourier Transform used by Shor's algorithm. Taking qubit connectivity into account, a single CX between 2 qubits that are 10 edges away gives an error rate of 1.5%.

Also, the more qubits you have/the more instructions are in your program, the faster the quantum state collapses. Exponentially so. Qubit connectivity is still ridiculously low (~3) and does not seem to be improving at all.

About AI, what algorithm(s) do you think might have an edge over classical supercomputers in the next 30 years? I'm really curious, because to me it's all (quantum) snake oil.


In addition to that, the absolutely enormous domains that the Fourier Transform sums over (essentially, one term in the sum for each possible answer), and the cancellations which would have to occur for that sum to be informative, mean that a theoretically-capable Quantum Computer will be testing the predictions of Quantum Mechanics to a degree of precision hundreds of orders of magnitude greater than any physics experiment to date. (Or at least dozens of orders of magnitude, in the case of breaking Discrete Log on an Elliptic Curve.) It demands higher accuracy in the probability distributions predicted by QM than could be confirmed by naive frequency tests which used the entire lifetime of the entire universe as their laboratory!

Imagine a device conceived in the 17th century, the intended functionality of which would require a physical sphere which matches a perfect, ideal, geometric sphere in Euclidean space to thousands of digits of precision. We now know that the concept of such a perfect physical sphere is incoherent with modern physics in a variety of ways (e.g., atomic basis of matter, background gravitational waves.) I strongly suspect that the cancellations required for the Fourier Transform in Shor's algorithm to be cryptographically relevant will turn out to be the moral equivalent of that perfect sphere.

We'll probably learn some new physics in the process of trying to build a Quantum Computer, but I highly doubt that we'll learn each others' secrets.


Beautiful analogy.


Re: AI, it's a long way off still. The big limitation to anything quantum is always going to be decoherence and t-time [0]. To do anything with ML, you'll need a whole circuit (more complex than Shor's) just to initialize the data on the quantum device; the algorithms to do this are complex (exponential) [1]. So, you have to run a very expensive data-initialization circuit, and only then can you start to run your ML circuit. All of this needs to be done within the machine's t-time limit. If you exceed that limit, then the measured state of a qubit will have more to do with outside-world interactions than interactions with your quantum gates.

Google's Willow chip has t-times of about 60-100 µs. That's not an impressive figure; in 2022, IBM announced their Eagle chip with t-times of around 400 µs [2]. Google's angle here would be the error correction (EC).

The following portion from Google's announcement seems most important:

> With 105 qubits, Willow now has best-in-class performance across the two system benchmarks discussed above: quantum error correction and random circuit sampling. Such algorithmic benchmarks are the best way to measure overall chip performance. Other more specific performance metrics are also important; for example, our T1 times, which measure how long qubits can retain an excitation — the key quantum computational resource — are now approaching 100 µs (microseconds). This is an impressive ~5x improvement over our previous generation of chips.

Again, as they lead with, their focus here is on error correction. I'm not sure how their results compare to competitors, but it sounds like they consider that to be the biggest win of the project. The RCS metric is interesting, but RCS has no (known) practical applications (though it is a common benchmark). Their T-times are an improvement over older Google chips, but not industry-leading.

I'm curious if EC can mitigate the sub-par decoherence times.

[0]: https://www.science.org/doi/abs/10.1126/science.270.5242.163...

[1]: https://dl.acm.org/doi/abs/10.5555/3511065.3511068

[2]: https://www.ibm.com/quantum/blog/eagle-quantum-processor-per...


> I'm curious if EC can mitigate the sub-par decoherence times.

The main EC paper referenced in this blog post showed that the logical qubit lifetime using a distance-7 code (all 105 qubits) was double the lifetime of the physical qubits of the same machine.

I'm not sure how lifetime relates to decoherence time, but if that helps please let me know.


That's very useful, I missed that when I read through the article.

If the logical qubit can have double the lifetime of any physical qubit, that's massive. Recall IBM's chips, with t-times of ~400 microseconds. Doubling that would change the order of magnitude.

It still won't be enough to do much in the near term - like other commenters say, this seems to be a proof of concept - but the concept is very promising.

The first company to get there and make their systems easy to use could see a similar run up in value to NVIDIA after ChatGPT3. IBM seems to be the strongest in the space overall, for now.


I'm sorry if this is nitpicky but your comment is hilarious to me - doubling something is doubling something, "changing the order of magnitude" would entail multiplication by 10.


Hahaha not at all, great catch. Sometimes my gray matter just totally craps out... like thinking of "changing order of magnitude" as "adding 1 extra digit".

Reminds me of the time my research director pulled me aside for defining CPU as "core processing unit" instead of "central processing unit" in a paper!


Wouldn't those increased decoherence times need to be viewed in relation to the time it takes to execute a basic gate? If the time to execute a gate also increases, it may undercut the practicality of having less noisy logical qubits.


> Also, the more qubits you have/the more instructions are in your program, the faster the quantum state collapses.

Was this actually measured and published somewhere?


How can I, a regular software engineer, learn about quantum computing without having to learn quantum theory?

> Worth spending a little time doing some long tail strategizing I’d say

any tips for starters?


If you're a software engineer, then the Quantum Katas might fit your learning style. The exercises use Q#, which is a quantum-specific programming language.

https://quantum.microsoft.com/en-us/tools/quantum-katas

The first few lessons do cover complex numbers and linear algebra, so skip ahead if you want to get straight to the 'quantum' coding, but there's really no escaping the math if you really want to learn quantum.

Disclaimer: I work in the Azure Quantum team on our Quantum Development Kit (https://github.com/microsoft/qsharp) - including Q#, the Katas, and our VS Code extension. Happy to answer any other questions on it.


Is there a reasonable pivot for someone well versed in the software engineering space to get in, or is it still the playground of relevant Ph.Ds and the like? I've been up and down the stack from firmware to the cloud, going on 14 years in the industry, have a Master's in CS, am the technical lead for a team, yada yada, but have been flirting with the idea of getting out of standard product development and back into the nitty gritty of the space I first pursued during undergrad.


> Is there a reasonable pivot for someone well versed in the software engineering space to get in, or is it still the playground of relevant Ph.Ds and the like?

there's no such thing as a practical QC and there won't be for decades. this isn't a couple of years away - this is "maybe, possibly, pretty please, if we get lucky" 25-50 years away. find the above comment that alludes to "2019 estimates needing ~20 million physical qubits" and consider that this thing has 105 physical qubits. then skim the posted article and find this number

> the key quantum computational resource — are now approaching 100 µs (microseconds)

that's how long those 105 physical qubits stay coherent for. now ponder your career pivot.

source: i dabbled during my PhD - took a couple of classes from Fred Chong, wrote a paper - it's all hype.


Start here: https://youtu.be/F_Riqjdh2oM

You don't need to know quantum theory necessarily, but you will need to know some maths. Specifically linear algebra.

There are a few youtube courses on linear algebra

For a casual set of videos: - https://youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFit...

For a more formal approach:

- https://youtube.com/playlist?list=PL49CF3715CB9EF31D

And the corresponding open courseware

- https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010...

Linear algebra done right comes highly recommended

- https://linear.axler.net/


+1 for 18-06 and Axler. Another, more concrete, option (not sure how much it will help with quantum theory) is Stephen Boyd's "Introduction to Applied Linear Algebra" available online here:

https://web.stanford.edu/~boyd/vmls/


Isn't there a Python library that abstracts most of it away with a couple of gigantic classes with incompatible dependencies?


The bar for entry is surprisingly low, you just need to brush up on intro abstract algebra. I recommend the following:

1. Kaye, LaFlamme, and Mosca - An Introduction to Quantum Computing

2. Nielsen and Chuang - Quantum Computation and Quantum Information (The Standard reference source)

3. Andrew Childs's notes here [1]. Closest to the state-of-the-art, at least circa ~3 years ago.

[1] - https://www.cs.umd.edu/~amchilds/qa/


specifically avoid resources written by and for physicists.

the model of quantum mechanics, if you can afford to ignore any real-world physical system and just deal with abstract |0>, |1> qubits, is relatively easy. (this is really funny given how incredibly difficult actual quantum physics can be.)

you have to learn basic linear algebra with complex numbers (can safely ignore anything really gnarly).

then you learn how to express Boolean circuits in terms of different matrix multiplications, to capture classical computation in this model. This should be pretty easy if you have a software engineer's grasp of Boolean logic.
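For example (a minimal numpy sketch of gates as matrices, my own illustration):

    import numpy as np

    ket0 = np.array([1, 0])                   # |0>
    ket1 = np.array([0, 1])                   # |1>
    X = np.array([[0, 1], [1, 0]])            # the quantum NOT gate
    print(X @ ket0)                           # [0 1] -> |1>, i.e. NOT 0 = 1

    # Two bits live in a 4-dimensional space ordered |00>, |01>, |10>, |11>;
    # CNOT flips the second bit when the first is 1 -- a reversible XOR.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    print(CNOT @ np.kron(ket1, ket0))         # [0 0 0 1] -> |11>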

Then you can learn basic ideas about entanglement, and a few of the weird quantum tricks that make algorithms like Shor and Grover search work. Shor's algorithm may be a little mathematically tough.

realistically you probably will never need to know how to program a quantum computer even if they become practical and successful. applications are powerful but very limited.

"What You Shouldn't Know About Quantum Computers" is a good non-mathematical read.

https://arxiv.org/abs/2405.15838


I recommend this book; I studied it in undergrad and never took a quantum theory course. https://www.amazon.com/Quantum-Computing-Computer-Scientists...


Are there any insights that you can give based off the info you've learned about quantum computation that you might not have been able to reach if you hadn't learned about it?

From my __very__ shallow understanding, because all of the efficiency increases are in very specific areas, it might not be useful for the average computer science interested individual?


Nearly all of quantum computation is theoretical algorithms, and the hard engineering problems haven't been solved. Most of the math, though, has a large amount of overlap with AI/ML and deep learning, to the point that quantum computers could be used as "ML accelerators" (this is called quantum machine learning) [1]. Quantum computing could be learned with a limited understanding of quantum theory unless you are trying to engineer the hardware.

https://en.wikipedia.org/wiki/Quantum_machine_learning


Possibly of interest, but I wrote a (hopefully approachable) report on quantum perceptrons a few years back [1]. Perhaps it's found elsewhere, but I was surprised by how, at least in this quantum algo's case, the basis of training was game theoretic not gradient descent!

[1] - https://kvathupo.github.io/cs/quantum/457_Final_Report.pdf


The simplest algorithm to understand is probably Grover's algorithm. Knowing that shows you how to get a sqrt(N) speedup on many classical algorithms. Then have a look at Shor's algorithm, which is the classic factoring algorithm.

I would not worry about hardware at first. But if you are interested and like physics, the simplest to understand are linear optical quantum circuits. These use components which may be familiar from high school or undergraduate physics. The catch is that the space (and component count) is exponential in the number of qubits, hence the need for more exotic designs.


I always recommend Watrous's lecture notes: https://cs.uwaterloo.ca/~watrous/QC-notes/QC-notes.pdf

I prefer his explanation to most other explanations because he starts, right away, with an analogy to ordinary probabilities. It's easy to understand how linear algebra is related to probability (a random combination of two outcomes is described by linearly combining them), so the fact that we represent random states by vectors is not surprising at all. His explanation of the Dirac bra-ket notation is also extremely well executed. My only quibble is that he doesn't introduce density matrices (which in my mind are the correct way to understand quantum states) until halfway through the notes.
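A tiny illustration of that probability analogy (my own sketch, assuming numpy):

    import numpy as np

    p = np.array([0.5, 0.5])                  # classical: non-negative reals summing to 1
    psi = np.array([1, 1j]) / np.sqrt(2)      # quantum: complex amplitudes whose squared magnitudes sum to 1
    print(p.sum())                            # 1.0
    print(np.abs(psi) ** 2)                   # [0.5 0.5] -- measurement probabilities, the Born rule
    # The crucial difference: amplitudes can be negative or complex, so they can
    # cancel (interfere), which ordinary probabilities never do.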


There is a course mentioned in the article, but I'm not clear on how "theory" it is.

https://coursera.org/learn/quantum-error-correction


Might be worth checking out: https://quantum.country/


If you want to learn about what theoretical quantum computers might be able to do faster than classical ones and what they might not, you can try to read about quantum complexity theory, or some of what Scott Aaronson puts out on his blog if you don't want to go that in depth.

But the key thing to know about quantum computing is that it is all about the mathematical properties of quantum physics, such as the way complex probability amplitudes work.


These lessons might be of help: https://youtu.be/3-c4xJa7Flk?si=krrpXMKh3X5ktrzT


First learn about eigenvalues.


How long until this can derive a private key from its public key in the cryptocurrency space? Is this an existential threat to crypto?


Long enough you don't need to panic or worry.

Short enough that it's reasonable to start R&D efforts on post-quantum crypto.


Is there a way to fork and add a quantum-proof encryption layer on the existing cryptocurrency paradigm i.e. Bitcoin 2.0?


You could replace ECDSA with a post-quantum algorithm. Keep in mind that many crypto primitives are safe, so there are large parts of bitcoin where you don't have to do anything. Digital signatures are going to be where the main problem is for bitcoin. But things like the hash algorithm should be fine (at most, quantum gives a square-root speedup for hashing, which isn't enough to be really concerning).

One thing that might be problematic for a blockchain, where everything has to go on the blockchain forever, is that some post-quantum schemes have really large signatures or key sizes.

I'm not that familiar with the details of bitcoin, but I had the impression that p2pkh is more secure against quantum computers.

[I should emphasize, I'm not a cryptographer and only somewhat familiar with bitcoin]


I had the same question and this article was really helpful in explaining the threat models

https://www.deloitte.com/nl/en/services/risk-advisory/perspe...


This is the same as climate change. Something might happen sometime in the future.

Data security okay. But AI? How will that change?


Aren't quantum computers expected to be like digitally read analog computers for high dimension optimization problems, and AI is like massive high dimension optimization problems?


Google is betting on digital quantum computers.

There are, however, analog quantum computers, e.g. by Pasqal, which hope to capitalize on this to optimize AI-like high dimension optimization problems.


But aren't qubits sort of analog? I thought the bits have figurative angles of rotation or something continuous that would resonate together and/or cancel out like neurons in fully-connected parts of neural nets that would be read out at the end of the pipeline.


Why do quantum computers need to be analog to be applied to such problems?


They don't, but if I interpreted the post above correctly, analog QCs are much easier to build.

At least as far as I'm aware, by "digital" they probably mean a generally programmable QC, whereas another approach is to encode a specific class of problems in the physical structure of an analog QC so that it solves those problems much faster than a classical machine. This latter approach is less general (so, for instance, you won't use it to factor large numbers), but much more attainable. I think D-Wave or someone like that already had a commercial application for optimization problems (either traveling salesman or something to do with port organization).


AI is essentially search. Quantum computers are really good at search.


Unstructured search is only a √n improvement. You need to find some way for algorithmically significant interference/cancellation of terms in order for a qc to potentially (!) have any benefit.


Search of what?


Anything. Everything. In domains where the search space is small enough to physically enumerate and store or evaluate every option, search is commonly understood as a process solved by simple algorithms. In domains where the search space is too large to physically realize or index, search becomes "intelligence."

E.g. winning at Chess or Go (traditional AI domains) is searching through the space of possible game states to find a most-likely-to-win path.

E.g. an LLM chat application is searching through possible responses to find one which best correlates with the expected answer to the prompt.

With Grover's algorithm, quantum computers let you find an answer in any disordered search space with O(sqrt(N)) operations instead of O(N). That's potentially applicable to many AI domains.

But if you're so narrow minded as to only consider connectionist / neural network algorithms as "AI", then you may be interested to know that quantum linear algebra is a thing too: https://en.wikipedia.org/wiki/HHL_algorithm


Grover's algorithm is useful for very few things in practice, because for most problems we have a better technique than checking sqrt(N) of all possible solutions, at least heuristicly.

There is, at present, no quantum algorithm which looks like it would beat the state of the art on Chess, Go, or NP-complete problems in general.


O(sqrt(N)) is easily dominated by the relative ease of constructing much bigger classical computers though.


Uh, no? Not for large N.

There are about 2^152 possible legal chess states. You cannot build a classical computer large enough to compute that many states. Cryptography is generally considered secure when it involves a search space of only 2^100 states.

But you could build a computer to search though sqrt(2^152) = 2^76 states. I mean it'd be big--that's on the order of total global storage capacity. But not "bigger than the universe" big.


Doing 2^76 iterations is huge. That's a trillion operations a second for two and a half thousand years if I've not slipped up and missed a power of ten.
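(Sanity-checking that figure, order of magnitude only:

    print(2 ** 76 / 1e12 / 3.15e7)    # iterations / (ops per second) / (seconds per year) ~= 2.4e3 years

so yes, roughly two and a half thousand years at a trillion operations per second.)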


Maybe 100 years from now we can do 2^18 quantum ops/sec and solve chess in a day, whereas a classical computer could do 2^36 ops/sec and still take longer than the lifetime of the universe to complete.


Google's SHA-1 collision took 2^63.1 hash operations to find. Given that a single hash operation takes more than 1000 cycles, that's only less than three doublings away.

Cryptographers worry about big numbers. 2^80 is not considered secure.


It's early so I'm thinking out loud here but I don't think the algorithm scales like this, does it?

We're talking about something that can search a list of size N in sqrt(N) iterations. Splitting the problem in two doesn't halve the compute required for each half. If you had to search 100 items on one machine it takes ~10 iterations, but split over two it'd take ~7 on each, or ~14 in total.


If an algorithm has a complexity class of O(sqrt(N)), by definition it means that it can do better if run on all 100 elements than by splitting the list in two and running it on each half of 50.

This is not at all a surprising property. The same things happens with binary search: it has complexity O(log(N)), which means that running it on a list of size 1024 will take about 10 operations, but running it in parallel on two lists of size 512 will take 2 * 9 operations = 18.

This is actually easy to intuit when it comes to search problems: the element you're looking for is either in the first half of the list or in the second half, it can't be in both. So, if you are searching for it in parallel in both halves, you'll have to do extra work that just wasn't necessary (unless your algorithm is to look at every element in order, in which case it's the same).

In the case of binary search, with the very first comparison, you can already tell in which half of the list your element is: searching the other half is pointless. In the case of Grover's algorithm, the mechanism is much more complex, but the basic point is similar: Grover's algorithm has a way to just not look at certain elements of the list, so splitting the list in half creates more work overall.
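The arithmetic for both examples, in case it helps:

    from math import sqrt, log2

    print(sqrt(100), 2 * sqrt(50))        # 10.0 vs ~14.1 Grover-style queries
    print(log2(1024), 2 * log2(512))      # 10.0 vs 18.0 binary-search comparisons
    # Any sublinear search pays its (sublinear) cost twice when you split the list,
    # so splitting always costs more total work than searching the whole thing at once.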


That only helps for a relatively small range of N. Chess happens to sort of fit into this space. Go is way out; even sqrt(N) is still in the "galaxy-sized computer" range. So again, there are few problems for which Grover's algorithm really takes us from practically uncomputable to computable.

Even for chess, 2^76 operations is still waaaaay more time than anyone will ever wait for a computation to finish, even if we assumed quantum computers could reach the OPS of today's best classical computers.


No-one would solve chess by checking every possible legal chess state -- also checking 'all the states' wouldn't solve chess, you need a sequence of moves, and that pushes you up to an even bigger number. But again, you can easily massively prune that, as many moves are forced, or you can check you are in a provable end-game situation.



training an ai model is essentially searching for parameters that can make a function really accurate at making predictions. in the case of LLMs, they predict text.


They showed a logical qubit that can stay entangled for an hour, but to do that they had to combine their hundred or so physical qubits into a single logical one, so in some sense they have, right now, a single (logical) qubit.


> AI and data security will be unalterably different if so

Definitely agree with the latter, but do you have any sources on how quantum computers make "AI" (i.e. matrix multiplication) faster?


Exploring via Search can become O(1) instead of M^N


That is definitely not true.


> AI and data security will be unalterably different if so

So what are the implications if so ?


> Worth spending a little time doing some long tail strategizing I’d say.

What do you mean by this?


"long tail" typically refers to the tail of a normal distribution - basically it's a sciencey, but common, way of saying "very unlikely event". So, the OP was saying that it's worth spending some time strategizing about the unlikely event that a practical RSA-breaking QC appears in the near future, even though it's still a "long tail" (very unlikely) event.

Honestly, there's not that much to discuss on this though. The only things you can do from this strategizing is to consider even encrypted data as not safe to store, unless you're using quantum resistant encryption such as AES; and to budget time for switching to PQC as it becomes available.


The only thing I know or understand about quantum computing is its ability to "crack" traditional encryption algorithms.

So the commenter is saying that cybersecurity needs to plan for a near-future world where traditional cryptography, including lots of existing data at rest, is suddenly as insecure as plaintext.


I think some element of it might be: Shor’s algorithm has been known of for 30 years, and hypothetically could be used to decrypt captured communications, right? So, retroactively I will have been dumb for not having switched to a quantum-resistant scheme. And, dumb in a way that a bunch of academic nerds have been pointing out for decades.

That level of embarrassment is frankly difficult to face. And it would be devastating to the self-image of a bunch of “practical” security gurus.

Therefore any progress must be an illusion. In the real world, the threats are predictable and mistakes don’t slowly snowball into a crisis. See also, infrastructure.


What would you switch to? There haven't been post-quantum systems to use until very, very recently.


All your encrypted communication from the '90s (SSL anyway) can probably be decrypted with classical means. '90s SSL was pretty bad.


Edit after skimming arxiv preprint[1]:

Yeah, this is pretty huge. They achieved the result with surface codes, which are general ECCs. The repetition code was used to further probe the quantum ECC error floor. "Just POC" likely doesn't do it justice.

(Original comment):

Also quantum dabbler (coincidentally dabbled in bitflip quantum error correction research). Skimmed the post/research blog. I believe the key point is the scaling of error correction via repetition codes, would love someone else's viewpoint.

Slightly concerning quote[2]:

"""

By running experiments with repetition codes and ignoring other error types, we achieve lower encoded error rates while employing many of the same error correction principles as the surface code. The repetition code acts as an advance scout for checking whether error correction will work all the way down to the near-perfect encoded error rates we’ll ultimately need.

"""

I'm getting the feeling that this is more about proof-of-concept, rather than near-practicality, but this is certainly one fantastic POC if true.

[1]: https://arxiv.org/abs/2408.13687

[2]: https://research.google/blog/making-quantum-error-correction...

Relevant quote from the preprint (end of section 1):

"""

In this work, we realize surface codes operating below threshold on two superconducting processors. Using a 72-qubit processor, we implement a distance-5 surface code operating with an integrated real-time decoder. In addition, using a 105-qubit processor with similar performance, we realize a distance-7 surface code. These processors demonstrate Λ > 2 up to distance-5 and distance-7, respectively. Our distance-5 quantum memories are beyond break-even, with distance-7 preserving quantum information for more than twice as long as its best constituent physical qubit. To identify possible logical error floors, we also implement high-distance repetition codes on the 72-qubit processor, with error rates that are dominated by correlated error events occurring once an hour. These errors, whose origins are not yet understood, set a current error floor of 10^-10. Finally, we show that we can maintain below-threshold operation on the 72-qubit processor even when decoding in real time, meeting the strict timing requirements imposed by the processor's fast 1.1 µs cycle duration.

"""


You got the main idea, it's a proof-of-concept: that a class of error-correcting code on real physical quantum chips obey the threshold theorem, as is expected based on theory and simulations.

However the main scaling of error correction is via surface codes, not repetition codes. It's an important point as surface codes correct all Pauli errors, not just either bit-flips or phase-flips.

They use repetition codes as a diagnostic method in this paper more than anything, it is not the main result.

In particular, I interpret the quote you used as: "We want to scale surface codes even more, and if we were able to do the same scaling with surface codes as we are able to do with repetition codes, then this is the behaviour we would expect."

Edit: Welp, saw your edit, you came to the same conclusion yourself in the time it took me to write my comment.


Haha, classic race condition, but I appreciate your take nonetheless!


Google could put themselves and everyone else out of business if the algorithms that underpin our ability to do e-commerce and financial transactions can be defeated.

Goodbye not just to Bitcoin, but also Visa, Stripe, Amazon shopping, ...


Bitcoin proof of work is not as impacted by quantum computers: Grover's algorithm provides a quadratic speedup for unstructured search, so SHA-256 ends up with 128 bits of security for preimage resistance. BTC can easily move to SHA-512.

Symmetric ciphers have similar properties (AES, ChaCha20). Asymmetric encryption at the moment would use ECDH (which breaks) to generate a key for use with symmetric ciphers; Kyber provides a PQC KEM for this.

So, the situation isn't as bad. We're well positioned in cryptography to handle a PQC world.



It seems you can get TLS 1.3 (or at least a slightly modified 1.3) to be quantum-secure, but it increases the handshake size by roughly 9x. Cloudflare unfortunately didn't mention much about the other downsides though.

https://blog.cloudflare.com/kemtls-post-quantum-tls-without-...


About one third of traffic with Cloudflare is already using post-quantum encryption. https://x.com/bwesterb/status/1866459174697050145

Signatures still have to be upgraded, but that's more difficult. We're working on it. http://blog.cloudflare.com/pq-2024/#migrating-the-internet-t...


Yes-ish. They're not enabled yet, but post-quantum signatures & KEMs are available in some experimental versions of TLS. None are yet standardized, but I'd expect a final version well before QCs can actually break practical signatures or key exchanges.


One third of all human traffic with Cloudflare is using a post-quantum KEM. I'd say that counts as enabled. We want that to be 100% of course. Chrome (and derivates) enabled PQ by default. https://radar.cloudflare.com/adoption-and-usage


It's currently believed that quantum computers cannot break all forms of public key cryptography. Lattice-based cryptography is a proposed replacement for RSA that would let us keep buying things online no problem.


Why is no one else talking about this? I came here to see a discussion about this and encryption.


Because this result is still very far from anything related to practical decryption.


And if they were, would they tell the world?


If they had a QC that could run Shor's algorithm to factor the number 1000, I'd guarantee you they'd tell the whole world. And it would still be a long, long time from there to having a QC that can factor 2048-bit numbers.


> Worth spending a little time doing some long tail strategizing I’d say.

Yup, like Bitcoin going to zero.


I'm a little more in my wheelhouse here -- without an algo change, Grover's algorithm would privilege quantum miners significantly, but not any more than the industry has seen in the last 13 years (C code on CPU -> GPU -> Large Geometry ASIC -> Small Geometry ASIC are similarly large shifts in economics for miners probably).

As to faking signatures and, e.g., stealing Satoshi's coins or just fucking up the network with fake transactions that verify, there is some concern and there are some attack vectors that work well if you have a large, fast quantum computer and want to ninja in. Essentially you need something that can crack a 256-bit ECDSA key in the window between when a public key is revealed in a broadcast transaction and when the block including it is confirmed. That's definitely out of the reach of anyone right now, much less persistent threat actors, much less hacker hobbyists.

But it won't always be. The current state of the art plan would be to transition to a quantum-resistant UTXO format, and I would imagine, knowing how Bitcoin has managed itself so far, that will be a well-considered, very safe, multi-year process, and it will happen with plenty of time.


You fool! And I say that affectionately. Another fool says: the security of Bitcoin relies on the inability to (among other things) derive a private key from a public key. This is just basic cryptography, like Turing vs. Enigma. This machine can "calculate" solutions to problems in time frames that break the whole way that cryptocurrency works. You better believe that what we hear about is old. These types of systems, and there must be non-public versions, could derive a private key from a public key in easily less than O(fu) time.

EDIT: it's like rainbow tables, but every possible variation is a color; not granular like binary, but any and all are included.


I think you're going to need about 10,000,000 qubits to divert a transaction, but that's still within foreseeable scale. I think it's extremely likely that the foundation will have finished their quantum-resistance planning before we get to 10MM coherent qubits, but still, it's a potential scenario.

More likely that other critical infrastructure failures will happen within trad-finance, which has a much larger vulnerability footprint, and being able to trivially reverse engineer every logged SSL session is likely to be a much more impactful turn of events. I'd venture that there are significant ear-on-the-wire efforts going on right now in anticipation of a reasonable bulk SSL de-cloaking solution. Right now we think it doesn't matter who can see our "secure" traffic. I think that is going to change, retroactively, in a big way.


I agree that the scary scenario is stored SSL frames from 20 years of banking. That's a nuclear-meltdown scenario.


To do what? Replay? Just curious on an attack vector.


Hopefully replay attacks will not be useful, but confidential information will be abundant. There will be actionable information mixed in there, and it will be a lot of data. Just imagine if everything whispered suddenly became shouted.


It could be true. But pearls are rare and storage is expensive.


It is.. and I don’t see a way to avoid it.


> Yup, like Bitcoin going to zero.

If the encryption on Bitcoin is broken, say goodbye to the banking system.


[pedantic hat on] Bitcoin doesn't use encryption.

You mean digital signatures - and yes, we use signatures everywhere in public key cryptography.


not really. banking systems have firewalls and access controls. quantum computations would be useless.


Those don't really mean anything when an attacker can eavesdrop on customer and employee comms and possibly redirect transactions (MITM).


Banking communications and transactions will all be protected by quantum-resistant protocols and ciphers well before that will become a problem. Most of these already exist, and some of them can even be deployed.


Bitcoin will just fork to a quantum-proof signature scheme and there will be something called "bitcoin classic" that is the old protocol (which few would care about)


Last time they did that, people stuck with "bitcoin classic" instead of the larger-block-size variant of bitcoin, which today is known as Bitcoin Cash.


eh, they will add a quantum-resistant signature scheme (already a well-understood thing) then people can transfer their funds to the new addresses before it is viable to crack the existing addresses


So the first company that can break bitcoin addresses using quantum computers gets a prize of how many billion(?) dollars by stealing all the non-migrated addresses.

Is that a crime? Lots of forgotten keys in there.


A very interesting philosophical and moral can of worms you just opened there. Bitcoin is governed by the protocol, so if the protocol permits anyone who can sign a valid transaction to move a given UTXO to another address, then it technically isn't a "crime". Morally I'm not sure I'd be able to sleep well at night if I unilaterally took what I didn't exchange value for.

As for the forgotten key case, I think the only way to prove you had the key at some point would need to involve the sender vouching for you and cryptographically proving they were the sender.


Morally, there is no quandary: it's obviously morally wrong to take someone else's things, and knowing their private key changes nothing.

Legally, the situation is the same: legal ownership is not in any way tied to the mechanism of how some system or another keeps track of ownership. Your BTC is yours via a contract, not because the BTC network says so. Of course, proving to a judge that someone else stole your BTC may be extremely hard, if not impossible.

Saying "if the protocol permits anyone who can sign a valid transaction involving a given UTXO to another address, then it technically isn't a "crime"" is like saying "traditional banking is governed by a banker checking your identity, so if someone can convince the banker they are you, then it technically isn't a "crime"".

The only thing that wouldn't be considered a crime, in both cases, is the system allowing the transaction to happen. That is, it's not a crime for the bank teller to give your money to someone else if they were legitimately fooled; and it's not a crime for the Bitcoin miners to give your money to someone else if that someone else impersonated your private key. But the person who fooled the bank teller /the miners is definitely committing a crime.


Traditional banking is governed by men with guns who depend on votes (for appearances). They always have recourse and motivation to intervene with private transactions. Not so much the case with bitcoin, which is extralegal for the most part and doesn't depend on them.


https://en.wikipedia.org/wiki/BGP_hijacking#Public_incidents

A long-term tactic of our adversaries is to capture network traffic for later decryption. The secrets in the mass of packets China presumably has in storage, waiting for quantum tech, are a treasure trove that could lead to crucial state, corporate, and financial secrets being used against us or made public.

AI being able to leverage quantum processing power is a threat we can't even fathom right now.

Our world is going to change.



