Hacker News

> I remember that some of the suggested changes from NSA shared with IBM were actually stronger against a cryptanalysis attack on DES that was not yet publicly known

So we have that and other examples of the NSA apparently strengthening crypto; then we have the Dual_EC_DRBG debacle and some of the info in the Snowden leaks showing that they've also tried to weaken it.

I feel like any talk about NSA influence on NIST PQ or other current algorithm development is just speculation unless someone can turn up actual evidence one way or another. I can think of reasons the NSA would try to strengthen it and reasons they might try to weaken it, and they've done both in the past. You can drive yourself nuts constructing infinitely recursive what-if theories.




The NSA wants "NOBUS" (NObody-But-US) backdoors. It is in their interest to make a good show of fixing easily-detected vulnerabilities while keeping their own intentional ones a secret. The fantasy they are trying to sell to politicians is that people can keep secrets from other people but not from the government; that they can make uncrackable safes that still open when presented with a court warrant.

This isn't speculation either; Dual_EC_DRBG and its role as a NOBUS backdoor was part of the Snowden document dump.


Here's the counter-argument that I've seen in cryptography circles:

Dual EC, a PRNG built on an asymmetric crypto template, was kind of a ham-fisted and obvious NOBUS backdoor. The math behind it made such a backdoor entirely plausible.
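For concreteness, here's that trapdoor in miniature: a Python sketch on a made-up tiny curve with no output truncation. The real generator uses P-256 and drops only 16 bits per output, which is what made recovering the state from public outputs practical; everything here (the curve, the trapdoor value, the seeds) is invented for illustration.

```python
# Toy Dual_EC_DRBG trapdoor demo on the tiny curve y^2 = x^3 + x + 1 over
# GF(103) (hypothetical parameters; the real generator uses P-256 and
# truncates each output by 16 bits).
p, A, B = 103, 1, 1

def add(P1, P2):
    """Affine point addition; None is the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P, k = add(P, P), k >> 1
    return R

def xcoord(P):
    if P is None:
        raise ValueError("hit the point at infinity")
    return P[0]

def lift(x):
    """Recover some curve point with this x-coordinate (works since p % 4 == 3)."""
    rhs = (x * x * x + A * x + B) % p
    y = pow(rhs, (p + 1) // 4, p)
    if y * y % p != rhs:
        raise ValueError("no point with this x")
    return (x, y)

Q = (0, 1)     # base point used for outputs
d = 5          # the trapdoor: P = d*Q, known only to the designer
P = mul(d, Q)  # base point used for state updates

def step(s):
    """One Dual_EC step: update the state, then emit one (untruncated) output."""
    s = xcoord(mul(s, P))
    return s, xcoord(mul(s, Q))

# Pick an initial state whose short chain avoids degenerate points.
for s0 in range(2, p):
    try:
        s1, r1 = step(s0)   # r1 is what an eavesdropper sees
        s2, r2 = step(s1)   # r2 is the output we will predict
        # Attacker: lift the public output back to a point (this gives
        # +/- s1*Q; the sign doesn't matter since both have the same x
        # after multiplying), then apply the trapdoor:
        #   x(d * (s1*Q)) = x(s1 * P) = s2, the generator's next state.
        s2_guess = xcoord(mul(d, lift(r1)))
        break
    except ValueError:
        continue

r2_guess = xcoord(mul(s2_guess, Q))
print("predicted next output:", r2_guess, "actual:", r2)
```

With the state recovered, every future output is predictable, which is exactly what made Dual EC so valuable as a backdoor and so suspicious as a design.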

That's less obvious in other cases.

Take the NIST ECC curves. If they're backdoored it means the NSA knows something about ECC we don't know and haven't discovered in the 20+ years since those curves were developed. It also means the NSA was able to search all ECC curves to find vulnerable curves using 1990s technology. Multiple cryptographers have argued that if this is true we should really consider leaving ECC altogether. It means a significant proportion of ECC curves may be problematic. It means for all we know Curve25519 is a vulnerable curve given the fact that this hypothetical vulnerability is based on math we don't understand.

The same argument could apply to Speck:

https://en.wikipedia.org/wiki/Speck_(cipher)

Speck is incredibly simple, with very few places a "mystery constant" or other backdoor could be hidden. If Speck is backdoored, it means the NSA knows something about ARX constructions that we don't, and we have no idea whether this mystery math also applies to ChaCha or BLAKE or any of the other popular ARX constructions gaining so much usage right now. That means that if we (hypothetically) knew for a fact that Speck was backdoored, but not how, it might make sense to move away from ARX ciphers entirely. It might mean many or all of them are not as secure as we think.
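To illustrate how little room there is to hide anything, here is a sketch of Speck128/128 in Python, following the published specification: the whole cipher is add-rotate-xor on 64-bit words, and the only constant anywhere is the round counter in the key schedule. (This is my own sketch; worth checking against the test vectors in the Speck paper before trusting it for anything.)

```python
# Speck128/128 sketch: 64-bit words, 32 rounds, no S-boxes or magic numbers.
MASK = (1 << 64) - 1

def ror(x, r): return ((x >> r) | (x << (64 - r))) & MASK
def rol(x, r): return ((x << r) | (x >> (64 - r))) & MASK

def round_fn(x, y, k):
    """One Speck round: x gets rotate-add-xor, y gets rotate-xor."""
    x = ((ror(x, 8) + y) & MASK) ^ k
    y = rol(y, 3) ^ x
    return x, y

def inv_round(x, y, k):
    """Exact inverse of round_fn, used for decryption."""
    y = ror(y ^ x, 3)
    x = rol(((x ^ k) - y) & MASK, 8)
    return x, y

def key_schedule(k0, l0, rounds=32):
    """The key schedule reuses the round function, keyed by the round counter."""
    ks, l, k = [k0], l0, k0
    for i in range(rounds - 1):
        l, k = round_fn(l, k, i)
        ks.append(k)
    return ks

def encrypt(x, y, ks):
    for k in ks:
        x, y = round_fn(x, y, k)
    return x, y

def decrypt(x, y, ks):
    for k in reversed(ks):
        x, y = inv_round(x, y, k)
    return x, y

# Demo with an arbitrary key and plaintext (not a claimed official vector).
ks = key_schedule(0x0706050403020100, 0x0f0e0d0c0b0a0908)
pt = (0x6c61766975716520, 0x7469206564616d20)
ct = encrypt(*pt, ks)
print(f"ct = {ct[0]:016x} {ct[1]:016x}")
```

That's the entire cipher. Any weakness would have to live in the structure of ARX itself, which is the point of the argument above.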


SM2 (Chinese), GOST (Russian) and NIST P (American) parameters are "you'll just have to straight up assume these are something up our sleeve numbers".

ECGDSA/Brainpool (German) and ECKCDSA (Korean) standards at least attempt to explain how their recommended parameters were chosen, but for the Brainpool parameters at least, the justifications fall short.

The DiSSECT[1] project, published this year, is an excellent approach to estimating whether selected parameters (often chosen without justification) are suspicious. The GOST parameters were found to be particularly suspect.

I wonder if a similar project could be viable for assessing parameters of other types of cryptographic algorithms e.g. Rijndael S-box vs. SM4 S-box selection?

[1] https://dissect.crocs.fi.muni.cz/


Interesting link, and yes it does look like the GOST curves are really suspect. I didn't see a graph for the NIST curves and they do not appear to have called them out.

There's a big difference though with the GOST curves. They were generated in what seems to be a 100% opaque manner, meaning they could have been back-calculated from something.

The NIST curves were generated in a way that was verifiably pseudorandom (each curve is derived from a published seed via SHA-1), but the seeds themselves were never explained. This makes it effectively impossible to straight-up back-calculate these curves from something else. NIST/NSA would instead have had to brute-force search seeds until one produced a breakable curve, which is the basis of the reasoning by the cryptographers I quoted above.
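A sketch of why "hash of a seed" rules out back-calculation but not a forward search. Everything here is a stand-in: SHA-256 instead of the actual X9.62 SHA-1 procedure, and an arbitrary predicate in place of the hypothetical weakness test only the designer would know.

```python
# "Verifiably pseudorandom" only means you can't pick the curve first and
# derive the seed afterwards.  A designer who could *test* candidates for a
# secret weakness could still enumerate seeds forward until one hits.
import hashlib

def seed_to_coefficient(seed: int) -> int:
    # One-way map from a seed to a candidate curve coefficient
    # (stand-in for the X9.62 seed-to-parameter derivation).
    return int.from_bytes(
        hashlib.sha256(seed.to_bytes(20, "big")).digest(), "big")

def secretly_weak(c: int) -> bool:
    # Stand-in for a hypothetical weakness test; arbitrarily defined here so
    # that roughly 1 in 1000 candidate curves "passes".
    return c % 1000 == 0

# Forward search: enumerate seeds until one yields a "weak" curve.
seed = next(s for s in range(10**6)
            if secretly_weak(seed_to_coefficient(s)))
print(f"found a 'weak' curve after {seed + 1} candidate seeds")
```

Publishing the winning seed still lets everyone "verify" the curve, which is why unexplained seeds don't establish rigidity.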

Note that the cryptographers I've seen make this argument aren't arguing that the NIST curves could not be suspect. What they're arguing is that if they are in fact vulnerable and were found by brute force search using 90s computers, all of elliptic curve cryptography may be suspect. If we (hypothetically) knew for a fact they were vulnerable but did not know the vulnerability, we'd know that some troubling percentage of ECC curves are vulnerable to something we don't know and would have no way of checking other curves. We'd also have no way of knowing if other ECC constructions like Edwards curves or Koblitz curves are more or less vulnerable.

So the argument is: either the NIST curves are likely okay, or maybe don't use ECC at all.

Bruce Schneier was for a time a proponent of going back to RSA and classical DH but with large (4096+ bit) keys for this reason. RSA has some implementation gotchas but the math is better understood than ECC. Not sure if he still advocates this.

Personally I think the most likely origin of the NIST constants was /dev/urandom. Remember that these were generated back in the 1990s before things like curve rigidity was a popular topic of discussion in cryptography circles. The goal was to get working curves with some desirable properties and that's about it.


That’s a great project, thank you for the link. Take my upvote, stranger.


Regarding Simon and Speck: one simple answer is that the complicated attacks may exist, and simple attacks certainly exist for the smaller block and key sizes.

However, it’s really not necessary to backdoor ARX designs directly when they’re specified with key sizes of 64, 72, 96, 128, 144, 192, or 256 bits and block sizes of 32, 48, 64, 96, or 128 bits, especially if quantum computers arrive while these ciphers are still deployed. Their largest block size is merely the smallest available for other block ciphers. The three smallest block sizes listed are laughable.

They do have larger key sizes specified at the upper end, but consider that if a smaller key is “good enough for NSA”, it will be used and exploited in practice. Not all bits are equal, either: Simon’s or Speck’s 128 bits are doubtfully as strong as AES’s 128 bits, certainly with half the bits of block size. It also doesn’t inspire confidence that AES had rounds removed, and that the AES-256 block size is… 128 bits. Suite A cryptography probably doesn’t include a lot of 32-bit block sizes; indeed, BATON supposedly bottoms out at 96 bits. One block size for me, another for thee?
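The concrete cost of a small block is the birthday bound: in common modes you expect block collisions, and real plaintext leakage, after roughly 2^(n/2) blocks of ciphertext under one key (this is the basis of the Sweet32 attack on 64-bit block ciphers). A quick back-of-the-envelope in Python:

```python
# Data volume at which the probability of a block collision reaches ~50%:
# about q = sqrt(2*ln 2) * 2^(n/2) blocks of n bits each (birthday bound).
import math

def data_to_collision(block_bits: int) -> float:
    """Bytes of ciphertext at which collision probability hits ~50%."""
    q = math.sqrt(2 * math.log(2)) * 2 ** (block_bits / 2)  # blocks
    return q * (block_bits / 8)                             # bytes

for n in (32, 48, 64, 96, 128):
    print(f"{n:3d}-bit block: ~{data_to_collision(n):.3e} bytes")
```

For a 32-bit block the 50% point arrives after a few hundred kilobytes, and for a 64-bit block after a few tens of gigabytes, which is why those sizes look so bad for anything long-lived.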

In a conversation with an author of Speck at FSE 2015, he stated that for some systems only a few minutes of confidentiality was really required. This was said openly!

This is consistent, in my view, with the NSA again intentionally pushing crypto that can be broken in certain conditions, to their benefit. This can probably be practically exploited through brute force with their computational resources.

Many symmetric cryptographers literally laugh at the NSA designs and at their attempts at papers justifying their designs.

Regarding the NIST curves, the SafeCurves project shows that implementing them safely is difficult. That doesn’t seem like an accident to me, but perhaps I am too cynical? Side channels are probably enough for targeted breaks. NIST-standardized ECC designs don’t need to be broken in ways that cryptographers respect; they just need to work for NSA’s needs.


NSA doesn’t want NOBUS; it’s not a person.

NSA leadership has policies to propose and promote the NOBUS dream. Even with Dual_EC_DRBG, the claims of NOBUS were incredibly arrogant. Just ask Juniper and OPM how that NOBUS business worked out. The NSA leadership wants privileged access and data at nearly any cost. The leadership additionally want you to believe that they want NOBUS for special, even exceptional cases. In reality they want bulk data, and they want it even if the NOBUS promises can fail open.

Don’t believe the hype; security is hard enough, and NOBUS relies on so many assumptions that it’s a comedy. We know about Snowden because he went public. Does anyone think we, the public, would learn if important keys were compromised through their backdoors? It seems extremely doubtful that even the Inspector General would learn, even if NSA itself could discover it in all cases.


I think it's just both. It's a giant organization of people arguing in favor of different things at different times over its history; I'd guess there's disagreement internally. Some argue it's critical to secure encryption (I agree with this camp); others want to be able to break it for offensive purposes, despite the problems that causes.

Since we only see the occasional unclassified material, we don't really know the details, and those who do can't share them.


There are plenty of leaked classified documents from NSA (and others) that have been verified as legitimate. Many people working in public know stuff that hasn’t been published in full.

Here is one example with documents: https://www.spiegel.de/international/world/the-nsa-uses-powe...

Here is another: https://www.spiegel.de/international/germany/inside-the-nsa-...

Please read each and every classified document published alongside those two stories. I think you may revise your comments afterwards.



