Quantum is unimportant to post-quantum (trailofbits.com)
110 points by woodruffw 4 months ago | 67 comments



> But even if a quantum computer is never built, new PQ standards are safer, more resilient, and more flexible than their classical counterparts.

lol wat? Nothing could be further from the truth. Just a few years ago, two of the four finalists of the NIST PQC process were broken so badly that you could break production key sizes in hours on a 10-year-old Mac mini. Most PQC deployments aren't broken because they double up with classical crypto systems so that you have to break both, but people who were using these schemes were basically wasting their time.

Key sizes and signatures are typically much, much larger than classical cryptosystems.

PQC systems lack the flexibility of classical cryptography. Generally speaking, the linear structure of classical schemes is what quantum computers exploit, and so it is gone from most (all?) PQC systems. That linear structure is what lets you do things like make a 2-of-2 by adding pubkeys and signatures. I'm not sure what is meant by flexibility if not stuff like this.
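
To make that concrete, here's a toy sketch (plain Python, a made-up modulus and generator, nothing resembling a real curve or real security) of the additive structure I mean: in a discrete-log group, combining secret keys corresponds to combining public keys, which is what naive 2-of-2 aggregation relies on.

    import secrets

    p = 2**127 - 1        # toy prime modulus, NOT a real group choice
    g = 3                 # toy generator

    sk1, sk2 = secrets.randbelow(p), secrets.randbelow(p)
    pk1, pk2 = pow(g, sk1, p), pow(g, sk2, p)

    # The public key for the combined secret sk1 + sk2 is just the product
    # (the group operation) of the individual public keys:
    assert pow(g, sk1 + sk2, p) == (pk1 * pk2) % p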


> Just a few years ago, two of the four finalists of the NIST PQC were broken so badly that you could break production key sizes in hours with a 10 year old Mac mini.

SIKE was not a Finalist, it was a Round 4 candidate.

Candidates being broken before the process finishes is a sign of the process working.


https://cacm.acm.org/news/nist-post-quantum-cryptography-can...

"Belgian researchers have cracked the SIKE cryptographic algorithm, a fourth and final-round candidate that the U.S. National Institute of Standards and Technology (NIST) was evaluating for its Post-Quantum Cryptography (PQC) standard."

SIKE was a finalist; it entered the final round.


SIKE would probably have been the most exciting PQC algorithm available if it hadn't been broken. It made it to the final round for a relatively brief time.

CRYSTALS also now has a possible attack, proposed by a recent paper on a similar algorithm, so time will tell whether any more of these algorithms are broken.


But SIKE was also broken in, like, a theatrical fashion. It was an explosive break!

It feels like there's some dancing around happening here, where the subtext is that where an algorithm is in a NIST contest process isn't determinative on its own, and that there are external reasons we tend to trust the lattice stuff (because it's been studied for decades, &c).


Even with the lattice stuff, the parameters keep having to be changed to ensure security. It may not be explosive breaks, but the cryptanalysis is advancing at stunning speed. This stuff is nowhere near prime-time ready.


I think we are agreeing, directionally.

I don't have a strong opinion in either direction and wouldn't be qualified to express one. I'm a vulnerability researcher with a little bit of cryptography to me, so I tend always to err on the side of utterly skeeved out. I assume, regardless of what national standards bodies recommend, this stuff will all be run hybrid for the foreseeable future, but who knows?

I don't think anybody can easily shut down the argument that it's too early for lattice PQC to simply replace ECDLP cryptography.


There is no reason NOT to run hybrid cryptography schemes right now, when the use case allows for it.

I do worry though that early standardization might lock us into a bad standard, and current PQC systems are too inflexible to allow for many use cases. The latter point means that if you want PQC, you might have to give up some features. E.g. multikey signature schemes, zero knowledge proofs, etc.


There is a very good reason: efficiency. Do you want to have servers carry an extra megabyte of otherwise-useless key material for each incoming connection? What about the extra ~100-1000 microseconds of CPU time for each key exchange?

Now multiply that by 100,000 live connections and 10,000 per second, and you'll see that there's a big problem here.
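
A quick back-of-envelope, taking those numbers at face value (they're on the pessimistic end, and the per-connection size in particular is disputed downthread):

    handshakes_per_second = 10_000
    cpu_seconds_per_handshake = 500e-6        # midpoint of the ~100-1000 us claim
    extra_bytes_per_connection = 1_000_000    # the claimed "extra megabyte"
    live_connections = 100_000

    dedicated_cores = handshakes_per_second * cpu_seconds_per_handshake    # = 5.0 cores
    extra_memory_gb = live_connections * extra_bytes_per_connection / 1e9  # = 100.0 GB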

A lot of cryptography arguments forget that people don't place unlimited value on cryptography (or even security in a general sense). These algorithms are costly, and if they do nothing, should not be used.


Kyber768, which is what I see most people using for hybrid PQC deployments right now, has 1kB keys and is actually faster than elliptic curve cryptography.


4 kB keys, you mean, right? You have to store your private key, too. It's pretty fast otherwise, yes.


You have to store 1 private key. You store/transmit/interact with millions of other public keys, so the concern over size is somewhat asymmetrical.


Many protocols use an ephemeral private key, and protocols like TLS that don't today are moving toward ephemeral private keys faster than they are toward PQC.

The ephemeral key gets signed with your long-lived private key, but that's all your long-lived key is used for.


> There is no reason NOT to run hybrid cryptography schemes right now, when the use case allows for it.

This is reasonable, but runs contrary to the stance taken by CNSA 2.0.


Gee, I wonder why.


Yeah the break of SIKE was pretty spectacular. The attacks on lattice-based crypto are less so. Still, the math is a lot more complicated than ECC or RSA, so it's plausible that it's also fundamentally broken, but we just haven't found out how.

At this point the only PQC algorithm I would trust with my data on its own is good old McEliece.


I'm not as convinced that the math on RSA is so simple compared to LWE cryptography. You'll see reputable cryptographers (well, one of them at least) confidently opining how we'll never break SHA2 and AES. I've never seen that for RSA or any ECDLP construction.

SIKE, sure, all bets are off.


The math for RSA is dead simple. But the theoretical foundations are not well established, no. It should be said though that SHA2 and AES also have unproven foundations for their security model. In a very strict sense both RSA and AES are equally vulnerable. But I think people would be more surprised if AES was catastrophically broken, and most surprised if SHA2 were, though there is little more than mathematical intuition underlying this.


> But even if a quantum computer is never built, new PQ standards are safer, more resilient, and more flexible than their classical counterparts.

I disagree.

First of all, they are far more inconvenient. The key and signature sizes are bigger. With Curve25519 ECC we can have 256-bit keys.

Secondly, flexibility is mentioned, but I think the last several years have shown that flexibility is a huge vulnerability in crypto systems.

Finally, I feel pretty confident that the NSA doesn't know too much more than we do about RSA or ECC, which have been studied, implemented, and analyzed for decades in the open. I worry with post-quantum algorithms that there is a good chance the NSA knows far more about them than does the public cryptography community.


> Secondly, flexibility is mentioned, but I think the last several years have shown that flexibility is a huge vulnerability in crypto systems.

All the criticism of cryptographic agility that I have seen has involved an attacker negotiating a downgrade to a broken protocol. But if the protocol is not yet broken, then being agile isn't a concern; and if/when the protocol does become broken, you can remove support for it, which is what you'd be forced to do anyway. A flexible approach just seems like a more gradual way to achieve that future transition.

Being distrustful of untested post-quantum algorithms is fair. But you can always double-encrypt a message with both classical and post-quantum algorithms, which requires an attacker to break both of them in order to decipher it. This guards against both short-term algorithmic weaknesses in our nascent post-quantum algorithms, as well as long-term threats to classical algorithms from hypothetical quantum computers.
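
For key exchange, the usual way to "double up" is to feed both shared secrets into a single KDF, so the session key is only recoverable by breaking both. A minimal sketch, assuming the pyca/cryptography package for X25519 and HKDF, with the post-quantum secret faked by os.urandom as a stand-in for (say) an ML-KEM encapsulation, which isn't in the standard library:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    client, server = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_ss = client.exchange(server.public_key())  # real X25519 shared secret
    pq_ss = os.urandom(32)                               # placeholder for a PQ KEM secret

    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-demo",
    ).derive(classical_ss + pq_ss)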

Key sizes are larger, yes, but not show-stoppingly large; on par with secure RSA key sizes, which is tolerable. Kilobytes, not megabytes.


> All the criticism of cryptographic agility that I have seen has involved an attacker negotiating a downgrade to a broken protocol.

Consider this an additional data point, then: https://paragonie.com/blog/2019/10/against-agility-in-crypto...

In the years since I wrote that, several people have pointed out that "versioned protocols" are just a safe form of "crypto agility". However, when people say "crypto agility", they usually mean something like what JWT does.

What JWT does is stupid, and has caused a lot of issues: https://www.howmanydayssinceajwtalgnonevuln.com/

If you want to use JWT securely, you have to go out of your way to do so: https://scottarc.blog/2023/09/06/how-to-write-a-secure-jwt-l...

> But if the protocol is not yet broken, then being agile isn't a concern, and if/when the protocol does become broken, then you can remove support for the broken protocol, which is what you'd be forced to do anyway, so a flexible approach just seems like a more gradual way to achieve that future transition.

This makes sense in situations where you have versioned protocols :)

This doesn't work if you're required to support RSA with PKCS1v1.5 padding until the heat death of the universe.
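
A hypothetical sketch of the versioned-protocol alternative (suite names made up for illustration): the version string pins exactly one suite, unknown or retired versions are rejected outright, and nothing is negotiated per message.

    SUITES = {
        "v1": "x25519 + chacha20poly1305",
        "v2": "x25519+mlkem768 hybrid + chacha20poly1305",
    }
    RETIRED = set()   # when a version breaks or ages out, add it here

    def open_envelope(token: str):
        version, _, payload = token.partition(".")
        if version not in SUITES or version in RETIRED:
            raise ValueError("unsupported protocol version: " + repr(version))
        return SUITES[version], payload   # caller decrypts with exactly this suite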


Hmmm, some recent protocols (thinking of MLS[1] here) have moved into a middle territory where they have a lot of options for piecing together a cryptographic suite, but then version that whole suite within the protocol. You can still change suites without changing the protocol, but it's not nearly as 'agile' as earlier suite and primitive negotiations.

Maybe something more like "cryptographic mobility" instead of "agility"? You can carefully decamp and move from one suite (versioned protocol) to another without changing all your software, but you're not negotiating algorithms and semantics on the fly.

[1] https://datatracker.ietf.org/doc/rfc9420


The problem is that the broken protocol is frequently not removed. It is shoved to the end of the list, where it remains available, but that's "good enough" because surely it won't actually get used.


Sure, but the same concern exists even for applications that have inflexible protocols. You either have someone willing and able to say that enough is enough and break backwards compatibility, or you don't.


Well, I think one of the problems with SSL 3.0 is that nobody really knew who was using it on purpose, and even the people using it didn't know. With something like WireGuard 2, you will presumably figure out pretty quickly which endpoints can't communicate if they're still on wg1. At least that's something.


I guess we then have to bet… Is it more likely that NSA has a secret large-scale quantum computer, or that they have a secret vulnerability in post-quantum algorithms? And does it matter anyway if they have a secret backdoor in your OS :)


Yeah, I feel like the article glosses over the fact that PQ is not about crypto flexibility in general but about a specific kind of flexibility that is immune to a hypothetical QC, and a lot is sacrificed to achieve that goal (namely signature sizes and/or, in some cases, performance).

And no, you shouldn't be using RSA, but they completely gloss over the fact that we have very, very good ECC standards that in no way seem to suffer the same problem as RSA (i.e. needing to regularly increase the key size), which is why, afaict, everyone has pretty much switched to ECC signatures unless they have some legacy reason.


For a long time I wondered why there was such a big push for PQ even though there was no quantum computer and a reasonably working one was always 15 years in the future.

… or was there a quantum computer somewhere and it was just kept hush hush, hence the push for PQ?

The answer turns out to be: it doesn’t matter if there is a quantum computer! The set of PQ algorithms has many other beneficial properties besides quantum resistance.


The point is that a lot of secrets need to remain secrets for many years. If some government found a way to break elliptic curves in the same way that the number field sieve broke RSA (hence we now need 2048-bit keys rather than the much shorter keys we were using in the 90s), we'd be fucked for many years to come as all those secrets are leaked.

So there may not be quantum computers now. But if there's going to be one in 20 years, we need our crypto to be resilient now.


Because SNDL ("store now, decrypt later") is a long-planned attack on secrets that have a long lifetime.

https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later


I'm a physicist working on QC. I know we actually don't know if a "secret" QC exists somewhere, but given that major theoretical and engineering breakthroughs are needed to build a fault-tolerant one (and all QC companies are facing this regardless of whether their qubits are optical, superconducting, trapped ions, etc.), I'd put that possibility at near zero. Consider also the talent and expertise that is needed for such an endeavour…


There are pathways to quantum computers that are intrinsically highly fault tolerant: https://sqc.com.au/news/


Very nice! I hope you folks will go far. Best of luck :)

That said, the entire field is still so far from a seriously useful QC that I still wouldn’t bet there’s a secret one somewhere in some government lab. Those are my two cents, and I may be wrong of course.


I’m not claiming there is. There might be, but I find it unlikely. When the NSA develops practical QC systems, a lot of QC research will suddenly go quiet. That hasn’t happened.

There is a viable pathway to low error rate, scalable quantum computers on a less than 10 year time horizon though.


50 million in a series A for a cold-fusion-like technology in Australia?? What’s going on here? Did they discover something groundbreaking?


There is a long history of this technology, and the comparison to cold fusion is unwarranted. This is peer-reviewed, accepted science. The basic technique was worked out under a DoE study in Texas, with an Australian collaborator. She (Dr. Michelle Simmons, who is widely respected in this field) then went out and raised money to scale up.

The basic idea is that they use scanning probe microscopes to create structures on a silicon surface with atomic precision, which can then be manipulated by the surrounding chip as a solid-state qubit. You still need error correction, but it ends up being a small constant factor rather than combinatorial blowup.

Full disclosure: I’m fundraising a startup to pursue a different manufacturing process that would enable the same type of quantum computer, but with nitrogen vacancies in diamond instead of silicon (and therefore still higher reliability).

One way or the other, highly reliable quantum computers are right around the corner and are going to take a lot of people by surprise.


>This is peer reviewed, accepted science.

This is also something that people outside academia apparently don't understand. Peer review doesn't tell you anything about the validity of the science. It only ensures the methodology was correct. The original Pons & Fleischmann paper passed peer review and was published in the Journal of Electroanalytical Chemistry. It only got retracted after other people tried and failed to reproduce it. If you want to know whether science is legit or not, look out for reproduction by independent actors - not peer review.


Indeed. Peer review is table stakes for the conversation, not an acceptance criterion for "true". Plenty of things get published that are generally regarded as wrong by those who work in the field.


There's journal peer review, and then there's scientific-community peer review, which involves acceptance of ideas and replication. They're not the same thing, and unfortunately they're not often distinguished in speech or writing ("peer review" describes both). I thought that on HN it would be clear I was talking about the latter.

In this case, three separate labs have replicated this work. It's solid.


Peer review in fundamental science is almost universally understood straightforwardly as part of the process of publishing said science. The other kinds you are referring to (there's actually more than one) are more common in other fields. Peer review in physics is very far from acceptance in general.


Maybe, but that's a very recent redefinition of terms. Peer review emerged as a standardized mechanism in the '70s to '90s, depending on the field. Until very close to the present, saying "passing peer review" meant something akin to the Popperian notion of "ideas that survive attempts at falsification by the broader community of scientific peers." In all my interactions with academia pre-pandemic, it meant precisely this. Something wasn't peer reviewed because it was published (surviving the editorial process), but because it was published and cited by later works without credible refutations emerging.


I don't know anything more than is written in this press release, but there's precedent: https://www.innovationaus.com/inside-run-psiquantum-given-a-...

> California-based startup PsiQuantum was given an “inside run” to a controversial $1 billion investment by Australian taxpayers as the only company that government engaged with in a thorough due diligence process.


It looks like that was government money also. The figures and the investment are weird in general.

Which tells you one thing…


And many problems, namely, enormous keys and signatures that make PKI nigh impossible for the embedded/IoT space.


According to the linked post there are PQ algorithms that will fit this niche:

> This variety of different trade-offs gives developers a lot of flexibility. For an embedded device where speed and bandwidth are important but ROM space is cheap, McEliece might be a great option for key establishment. For server farms where processor time is cheap but saving a few bytes of network activity on each connection can add up to real savings, NTRUSign might be a good option for signatures. Some algorithms even provide multiple parameter sets to address different needs: SPHINCS+ includes parameter sets for “fast” signatures and “small” signatures at the same security level.


Embedded/IoT is typically slow and small, which is not a space PQ fits into.

I also think the article is overly optimistic in claiming that ECC is "hard" because of the need for careful curve selection (even though we have very good established curves); I find it hard to believe that PQ algorithms are immune to parameter-selection problems and implementation challenges.


There has been research on the intersection of IoT and PQ signatures specifically at least, e.g. see "Short hash-based signatures for wireless sensor networks" [0] [1]. Unlike SPHINCS+ which is mentioned in the article, if you're happy to keep some state around to remember the last used signature (i.e. you're not concerned about accidental re-use) then the scheme can potentially be _much_ simpler.

[0] https://web.archive.org/web/20110401080052/https://www.cdc.i...

[1] https://news.ycombinator.com/item?id=33925383 I wrote about this "Dahmen-Krauß Hash-Chain Signature Scheme" (DKSS) algorithm previously in a comment a couple of years ago
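
Not the DKSS construction from those links, but the classic Lamport one-time signature is a compact way to see how signing out of nothing but a hash function works, and why state matters (each keypair must sign exactly one message). Illustration only, not production code:

    import hashlib, os

    H = lambda b: hashlib.sha256(b).digest()

    def keygen():
        sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def bits(msg):
        d = H(msg)
        return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

    def sign(sk, msg):          # reveals one preimage per digest bit
        return [sk[i][b] for i, b in enumerate(bits(msg))]

    def verify(pk, msg, sig):
        return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(msg)))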


The state is enormous. Dedicating megabytes and megabytes to key state is painful. And so is tracking state across components and through distribution channels. If you’re not afraid of that then just use symmetric crypto and be done with it.


> use symmetric crypto

To be clear my comment is specifically only relating to signature schemes, not encryption.

> The state is enormous

The scheme I linked to points towards efficient "pebbling" and "hash chain traversal" algorithms which minimize the local state required in quite a fascinating way (e.g. see https://www.win.tue.nl/~berry/pebbling/).

> tracking state across components and through distribution channels

Assuming you have reliable ordering in those channels I don't see how the stateful nature of such schemes makes it hugely more complex than the essential hard problem of key distribution.


The signature size for hash-based algorithms is around 16 kB, and can feasibly be reduced to 8 kB. The key sizes are around 32 bytes.


Lattice-based signatures are around 1 kB.


Also, we are talking about mitigating a large, tangible downside risk from a sudden breakthrough in the space: all the secrets stop being secret. "Reasonable" timeline estimates for how far away we are matter for things like if/how much we invest in the tech, but optimistic timelines become pessimistic when defending against downsides, and we should be pessimistic when preparing regulations and mitigations.


> … or was there a quantum computer somewhere and it was just kept hush hush, hence the push for PQ?

If there were a quantum computer somewhere, or close to one, it would be reasonably likely for it to be secret.

I look at the history of crypto in the mid to late 20th century, for example. Small groups in the Allied governments, the NSA, etc. certainly had more knowledge than was public, by a wide margin, for years to decades.


By the 1990s they were pretty rubbish. DES could be cracked by home PCs in a couple of days.


That's not quite correct. The first (public) brute-forcing of DES was done in 1997 by the DESCHALL project, distributing the search across tens of thousands of volunteers' computers for weeks [1]. The EFF then spent $250,000 to build a dedicated DES cracker ("Deep Crack"), which required an average of four days per key found [2].

[1] https://en.m.wikipedia.org/wiki/DESCHALL_Project

[2] https://en.m.wikipedia.org/wiki/EFF_DES_cracker


DES itself is an example of the NSA being ahead of the field. They designed the S-boxes to be resistant against attacks nobody knew about yet.

Like the sibling comment points out, you're overstating the weakness of DES as well.


> These are all special instances of a more general computational problem called the hidden subgroup problem. And quantum computers are good at solving the hidden subgroup problem. They’re really good at it.

I assume they mean the hidden subgroup problem for abelian groups? Later they mention short integer solutions (SIS) and learning with errors (LWE), which, by my understanding, both rely on the hardness of the shortest vector problem, corresponding to the hidden subgroup problem for some non-abelian groups. I haven't read into this stuff for a while, though.
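
For concreteness, a toy LWE instance looks like this (parameters nowhere near realistic). The small error term is what turns easy linear algebra into a hard problem: without e, recovering s is plain Gaussian elimination.

    import random

    q, n, m = 97, 8, 16
    s = [random.randrange(q) for _ in range(n)]                      # secret
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public matrix
    e = [random.choice((-1, 0, 1)) for _ in range(m)]                # small noise
    b = [(sum(a * x for a, x in zip(row, s)) + err) % q
         for row, err in zip(A, e)]
    # Search-LWE: given (A, b), recover s.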


Considering that the question of whether classical methods can break the current breed of secure algorithms is still open, I see PQC as a hedge against the possibility that P=NP.


I don't know why this was downvoted. Even if P != NP, there are still other hardness assumptions baked into modern cryptography which might turn out to be false. The abelian hidden subgroup problem (of which both RSA and elliptic curves are instances) may turn out to have an efficient classical solution.


If you think P=NP is possible then you might as well think BPP=BQP as well.


"Of course, one big concern is that everybody is trying to standardize cryptosystems that are relatively young. What if the industry (or NIST) picks something that’s not secure? What if they pick something that will break tomorrow?"

If information remains interesting to an adversary long-term, they can always archive classically encrypted text and apply future hardware and algorithmic advances to cracking it.

This is why "post-quantum" cryptography may well end up being quantum cryptography (QC). QC must be broken at the time of transmission for an adversary to obtain any information at all. If you're trying to communicate something that will remain sensitive long-term, with QC you aren't betting against the future producing something surprising.

QC already works; it's getting cheaper, faster, and more network-friendly. It's not ready for the internet yet, but it's getting there. We don't need it for information that changes every few years, like credit card info, but that's not all people use cryptography for, even today.


Quantum key distribution is completely unsuitable as a replacement for asymmetric cryptography, even ignoring the (enormous) technological difficulties. It fundamentally cannot handle authentication, and it requires a pre-shared key to prevent MITM attacks, so one could just as well replace it with any modern stream cipher (which is astronomically more economical and doesn't require physical point-to-point connections). Even exchanging an SSD with a long key and using a one-time pad is far superior to QKD.
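
And the one-time-pad option really is that simple, assuming you can generate truly random pad bytes, ship them securely (the SSD), and never reuse them:

    def otp(message: bytes, pad: bytes) -> bytes:
        # XOR with fresh pad bytes; the same function encrypts and decrypts.
        if len(pad) < len(message):
            raise ValueError("pad exhausted -- never reuse pad bytes")
        return bytes(m ^ p for m, p in zip(message, pad))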


Don't knock sending around gigantic one-time pads on SSDs until you try it :)

In all seriousness, QKD also has huge SNR problems over long range. QKD is really a layer 1 protocol that needs you to entangle qubits far away from each other, and that relies on having a very clean physical layer. You can sort of do it over short-range fiber right now with a known-to-be-direct connection, but any other medium is still off the table. Once you have done the QKD layer 1, put any protocol you want on top of it.


Can't we sign the key with SPHINCS+ and be back to the same situation as with classical asymmetric cryptography?


QC requires direct contact between sender and receiver (if you have direct contact, why do you care about cryptography at all?); can't be used on stored data; can't prove authenticity; and can inherently be broken by a man-in-the-middle attack.

So, no, it's not a viable replacement for anything.


QC works at a very, very limited scale, and it's not clear that it can be made to scale to solve problems faster than classical computers.


That sounds very interesting. Where can I read more about QC over internet and how we’re getting closer to it?




