Ask HN: What encryption algorithms should we take as compromised?
64 points by Comkid on April 18, 2014 | 48 comments
After constantly hearing about all the different revelations regarding the NSA and its backdooring of various algorithms, I've totally lost track of which algorithms we should distrust and what their replacements are. For example, is SSH2-RSA known to be 'broken'?



"SSH2-RSA" isn't an encryption algorithm. It's a description of the SSH protocol using RSA authentication.

It is easier to provide the list of things that are worth worrying about than it is to list the things that are safe. There are a lot of as-yet unbroken ciphers and constructions. So, here are the things to avoid:

* Block ciphers in the default mode ("ECB").

* The Dual_EC random number generator, which virtually nobody uses anyway. You weren't going to accidentally end up using it. Or, for that matter, any other PKRNG (random numbers produced by public key algorithms).

* RSA with 1024 bit moduli (or below); RSA-2048 is your starting point. Conventional DH at similar key sizes will be an issue too, but there's a "means/motive/opportunity" issue for RSA-1024 given its prevalence.

* MD4, MD5, and SHA1 aren't backdoored, but they are broken or weak. That said, all three are survivable in HMAC (don't use them, though). SHA2 is your best all-around hashing bet right now (see the sketch after this list).

* The NIST P- curves. There's no evidence to suggest they're backdoored, but (a) the rationale behind their generation is questionable and (b) they have other annoying properties.
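On the HMAC point above: here's a minimal example of the sane default (HMAC with SHA-256), using nothing but the Python standard library. The key and message are placeholders:

  import hmac, hashlib, os

  key = os.urandom(32)   # in practice the key comes from your KDF/protocol
  tag = hmac.new(key, b"message", hashlib.sha256).digest()

  # Always compare MACs in constant time, never with ==:
  expected = hmac.new(key, b"message", hashlib.sha256).digest()
  assert hmac.compare_digest(tag, expected)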

So far as I can tell, you are now fully briefed on the "distrusted" crypto.

Don't build your own crypto. Use PGP for data at rest, TLS for data in motion, and NaCl for the rare in-between cases.
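To make the NaCl suggestion concrete, a minimal sketch using PyNaCl (the Python binding to NaCl; key and message are placeholders):

  import nacl.secret, nacl.utils

  key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
  box = nacl.secret.SecretBox(key)        # XSalsa20 + Poly1305, authenticated
  ct = box.encrypt(b"attack at dawn")     # a random nonce is prepended for you
  assert box.decrypt(ct) == b"attack at dawn"

You never pick a mode, a nonce, or a MAC; that's the point of NaCl.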


>* Block ciphers in the default mode ("ECB").

You can plainly see the problem ECB causes in this example image: http://legacy.kingston.com/secure/image_files/Figure2_ECB.jp...


This is a perennial favorite illustration of "what's wrong with the default mode", but its biggest problem is that, given chosen plaintext, you can often decrypt ECB ciphertexts a byte at a time.
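The property the image shows is easy to reproduce. A minimal sketch with the pycryptodome library (key and plaintext are placeholders): identical plaintext blocks encrypt to identical ciphertext blocks under ECB.

  import os
  from Crypto.Cipher import AES

  key = os.urandom(16)
  ecb = AES.new(key, AES.MODE_ECB)
  ct = ecb.encrypt(b"sixteen byte blk" * 2)   # two identical 16-byte blocks...
  assert ct[:16] == ct[16:]                   # ...two identical ciphertext blocks

The byte-at-a-time attack builds on exactly this leak: an attacker who controls a plaintext prefix can recognize when a guessed block matches a target block.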


For the record, DH key sizes are smaller than RSA keys of the same strength, based on our current understanding of the computational effort involved in attacking them. DH-256 should be considered on the verge of too small; 340 or 512 bits will be necessary going forward.


I think I'm probably confusing terms here; you're referring to the size of "a", and I'm referring to the size of "p".

Perhaps 'pbsd will be around in a bit to resolve whether the index calculus will push the size of p or a first; my understanding is that it's bounded by the size of the modulus, and that most of the work it does is independent of the specific element of the group you're attacking.

I am definitely a lot fuzzier on DH key sizes than on RSA; we're getting into cryptanalytic attacks that don't have a lot of relevance to the kind of work I do.


The NFS complexity for factorization and discrete logarithm is asymptotically the same. In practice, the matrix step is more costly in discrete logarithm, but this should increase the bit security by at most a handful of bits, which is not enough to justify differentiating DL estimates from RSA estimates.

This refers to the size of p, the prime modulus. Sizes of exponents and/or subgroups are not affected by the complexity of the NFS, so they generally only need to be twice the target bit security to avoid birthday-type attacks.
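To pin down which number is which, a sketch with the pyca/cryptography package (parameter generation is slow, and the sizes shown are illustrative). key_size below is the size of the prime modulus p, the quantity the NFS attacks, so it's the one that needs to be 2048+ bits:

  from cryptography.hazmat.primitives.asymmetric import dh

  params = dh.generate_parameters(generator=2, key_size=2048)  # size of p
  alice = params.generate_private_key()
  bob = params.generate_private_key()
  shared = alice.exchange(bob.public_key())    # g**(ab) mod p, as bytes
  assert shared == bob.exchange(alice.public_key())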


So it's the size of p that matters most in practice, assuming other parameters are sane.


No, SSH2-RSA is not known to be broken, although it's suspected that the NSA can factor some small (<=1024-bit) RSA keys if they really want to.

It's believed that any elliptic curve algorithm that doesn't have a transparent process for choosing the curve parameters may have been backdoored by the NSA choosing points whose discrete logs they already knew. If you use those curves, then you're revealing your secrets to the NSA but not to anyone else, because the discrete log problem is still (mostly) just as hard as it ever was.

Specifically, the elliptic curve random number generator in NIST SP 800-90A is believed to have been backdoored by the NSA. For obvious reasons no one has any hard proof, just very strong circumstantial evidence.

You can continue to use SSH2-RSA with decent-sized (2048-bit minimum) keys & AES. Those are not believed to be breakable at the current time, although, as ever, you can never have absolute certainty in these matters!


The word "may" is doing a lot of work in the sentence "may have been backdoored". What cryptographers are observing about the NIST P- curves is that it isn't impossible for them to have been backdoored; that there is a plausible technique that NSA could have used, given some an advance in ECC cryptanalysis unknown to public science but known to them, that could result in a backdoor.

Everything beyond that is the precautionary principle.

It's also really important to understand the difference between Dual_EC (the random number generator) and the NIST curves. There is much more circumstantial evidence against Dual_EC. Importantly, the potential backdoor in Dual_EC isn't really related to elliptic curves; you can describe a functionally similar backdoored RNG using other public key algorithms.
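For the curious, here is a toy sketch (emphatically NOT real crypto; every value is tiny and made up) of the shape of such a PKRNG backdoor, using textbook RSA in place of elliptic curves. Each output is a public-key image of the internal state, so whoever holds the private key can recover the state from one output and predict everything afterward:

  import hashlib

  p, q = 1000003, 1000033              # toy primes only the designer knows
  n, e = p * q, 65537                  # public parameters baked into the RNG
  d = pow(e, -1, (p - 1) * (q - 1))    # the backdoor (pow(-1) needs Python 3.8+)

  def step(state):
      out = pow(state, e, n)           # what the RNG emits
      nxt = int.from_bytes(hashlib.sha256(
          state.to_bytes(16, "big")).digest(), "big") % n
      return out, nxt

  out1, s1 = step(123456789)           # two consecutive outputs...
  out2, _ = step(s1)

  state = pow(out1, d, n)              # ...but d inverts the first output,
  _, s1_rec = step(state)              # so the attacker replays the RNG
  predicted, _ = step(s1_rec)
  assert predicted == out2             # and predicts the second

Dual_EC has the same structure, with a secret scalar relationship between curve points where this toy uses an RSA private key.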


Your glass appears to be half full, mine half empty :)


No, it's not; the fullness of our glasses is orthogonal to the specific cryptographic issue we're discussing. I would recommend against the NIST P- curves.

One fortunate result of the Snowden disclosures is that for several reasons, some rational and some irrational, the market value of NIST/FIPS certification has plummeted --- it's still an issue if you're selling to the government, but no longer carries security cachet.

As a result, there's minimal upside to adopting cryptographic primitives and constructions simply because they have NIST standards backing them. Which means there's minimal upside to using the NIST curves.

Meanwhile, there are multiple downsides. One of them is the potential for backdoors, but I don't need to reach that issue in my analysis because another is the difficulty of safely implementing curve software with the NIST P-curves.


If that's the standard, mine must be empty.


> If you use those curves, then you're revealing your secrets to the NSA but not to anyone else.

...until some worker or contractor takes the "secret" values for themselves, sells them, or publishes them on the internet. Producing public standards with built-in master keys increases the possibility of overnight global breakage.


The public standard shouldn't include the secret values, but rather identify the (verifiable) process for generating the public values, in order to assure people that they were not created from secret values.

See: https://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number

(Or, of course, you could just not publish RNG standards based on public-key crypto ;-)


> The public standard shouldn't include the secret values

It seems there's enough evidence that NSA inserted the secret values in one standard already:

http://en.wikipedia.org/wiki/Dual_EC_DRBG


Both Snowden and Schneier said something to the effect of "trust the math." [1,2] Additionally, the leaked Tor presentation [3] seems to indicate that the NSA cannot break the primitives used in Tor. So the algorithms that were considered secure before the Snowden leaks seem to be secure. (But this is purely a statement about algorithms; you still need to use a well-studied and tested implementation of them.)

[1] Schneier: http://www.theguardian.com/world/2013/sep/05/nsa-how-to-rema...

[2] Snowden: http://www.theregister.co.uk/2014/03/10/snowden_a_few_good_d...

[3] http://www.theguardian.com/world/interactive/2013/oct/04/tor...


In general, you should prefer crypto constructions that are the result of open global competitions, for example AES and SHA-3.

You should avoid at all costs anything standardized by NIST without going through years of review by international cryptographers. Dual_EC_DRBG is a clear example of a construction that falls into this category.

This is my general rule of thumb.

However, knowing which ciphers one should use is not enough! You absolutely need to know HOW to use them. A basic example is AES in ECB mode, which is semantically secure only as long as a given key encrypts one single block. Another is knowing after how many encrypted blocks a key should be rotated, which depends on the underlying cipher.

Once you have learned how to use the basic building blocks of crypto, you are NOT supposed to write your own implementation, but to use existing ones. There is a small problem with this: they are either broken or they don't implement all the constructions you need. OpenSSL is an example of a broken crypto implementation, while NaCl does not implement TLS.

So this is a short summary, and my personal opinion, of why crypto is hard. On top of all this, there are not enough experts out there with the time to review crypto implementations and constructions, new and old, and we are living in a historical period where we desperately need crypto to protect our privacy.

So my final suggestion is to take some of your spare time and go through Dan Boneh's Crypto I on Coursera: https://www.coursera.org/course/crypto

It is worth every single minute.

Once you have done that, I would also suggest taking the Matasano crypto challenges: http://www.matasano.com/articles/crypto-challenges/

Finally, I want to thank everybody who has taken the time to create and maintain both the Crypto I course and the Matasano challenges.


> In general you should prefer crypto constructions which are a result of global competitions. For example AES and SHA3.

The judges who chose AES and SHA-3 as the "winners" of the global competitions are the NSA.

> You should avoid at all costs anything that has been standardized by NIST...

That would include AES and SHA-3.


> The judges who chose AES and SHA-3 as the "winners" of the global competitions are the NSA.

Sure, but this process creates alternatives: if the crypto community thinks the winner is backdoored, I am pretty sure we will know it, and we will have a valid alternative ready to be implemented. Additionally, if NSA/NIST modifies the specs for a construction, there is still the possibility of implementing the original one. See SHA-3, for instance: it was about to be weakened, but the crypto community could still implement the original spec.

> That would include AES and SHA-3.

You cut the rest of the sentence and completely changed its meaning. My original sentence included: "...without going through years of reviews by international cryptographers." Take a look at this video by DJB: https://www.youtube.com/watch?v=G-TM9ubxKIg He gives a great example with Dual_EC_DRBG, where many cryptographers told NIST that there could be a backdoor. NIST's answer was basically: sorry, too late, it has already been implemented!

In other words, in the case of Dual_EC_DRBG the standardization process ran in reverse: first NIST standardized it, and only then did the crypto community review it and find the problems.


If you're wondering what isn't compromised, the information here has withstood the test of time and scrutiny from the crypto community: http://www.daemonology.net/blog/2009-06-11-cryptographic-rig...

Barring some major advance in breaking crypto (which is entirely possible), it will probably stand for a long time to come.


That is the charitable way to describe Colin's suggestions. Another way to describe them is "well-aged", or "conservative".

Here are more modern alternatives to each of Colin's suggestions:

* Message encryption: AES-CTR+HMAC -> A fast native stream cipher (Salsa20) + polynomial MAC (Poly1305, VMAC).

* Standalone integrity checking: HMAC -> HMAC or SHA3.

* Hash: BLAKE2 or SHA3.

* Passwords: scrypt or, if not available, bcrypt.

* Public key encryption: ECDH + whatever you're using for message encryption, over Curve25519.

* Public key signatures: Deterministic ECDSA, EdDSA (see the sketch after this list).

* Ephemeral key agreement: ECDH over Curve25519.

* Online backups: use Tarsnap.
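Sketches of two of these via PyNaCl, the Python binding to the NaCl/libsodium family (keys and messages are placeholders):

  from nacl.public import PrivateKey, Box
  from nacl.signing import SigningKey

  # ECDH over Curve25519 + authenticated encryption:
  alice, bob = PrivateKey.generate(), PrivateKey.generate()
  ct = Box(alice, bob.public_key).encrypt(b"hello")
  assert Box(bob, alice.public_key).decrypt(ct) == b"hello"

  # EdDSA (Ed25519): deterministic, no per-message randomness to leak.
  sk = SigningKey.generate()
  sk.verify_key.verify(sk.sign(b"message"))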


> Another way to describe them is "well-aged", or "conservative".

I absolutely agree. I am 100% in favour of being conservative when choosing cryptographic primitives.

All the alternatives you've mentioned have arguments in their favour. But unless you need to have signatures which are 32 bytes instead of 256 bytes, or you need to perform 10,000 private key operations per second instead of 1,000, or you need to build an ASIC which uses a few thousand fewer transistors, my recommendation is to be conservative.


I wonder whether I might not be able to convince you that RSA is less conservative than ECC. Certainly, I haven't once tested a piece of real-world software that got it right. Did you see that just the other day, a team managed to resurrect the PKCS1v15 padding oracle in JSSE? Is RSA plus OAEP really simpler than just using a curve to derive a key to encrypt with?
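For contrast, here's what getting RSA encryption right looks like with the pyca/cryptography package: OAEP, never PKCS#1 v1.5, and in practice you'd only be encrypting a symmetric key this way (a sketch; key size and message are placeholders):

  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import rsa, padding

  key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)
  ct = key.public_key().encrypt(b"a 32-byte session key goes here.", oaep)
  assert key.decrypt(ct, oaep) == b"a 32-byte session key goes here."

Compare that pile of parameters to the two-line Curve25519 Box elsewhere in this thread.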


This question only makes sense if you specify the threat model to consider.

Is it only classical cryptanalysis of the cryptographic algorithm? Do you take into account programming mistakes (not necessarily crypto-related) in specific implementations? Do you allow side-channel or fault-injection attacks, which can break most implementations that lack specific countermeasures?

In any case, it is a very difficult question without a single definite answer.


Exactly. A lot is lost when security deals strictly with theory instead of pragmatism. Theoretical breaks in crypto algorithms are important, but much weaker links in the chain are easier to attack. Using the best, unbreakable crypto does not protect you from more realistic attacks.

Obligatory XKCD:

http://xkcd.com/538/


For an n-bit RSA key "The absolute minimum size for n is 2048 bits or so if you want to protect your data for 20 years. [...] If you can afford it in your application, let n be 4096 bits long, or as close to this size as you can get it."

http://www.javamex.com/tutorials/cryptography/rsa_key_length...


Ciphers to avoid: DSS, MD5/RC4, SHA-1.

Ciphers to prefer: ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256

A pretty good source/guide:

https://hynek.me/articles/hardening-your-web-servers-ssl-cip...

You'll need Apache 2.4+ [I think], or nginx. And possibly fresh certs to use DHE/EC.

A quick rundown of a fairly secure setup:

Cipher Priority list:

ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:!ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:!RSA+3DES:!aNULL:!MD5:!DSS:!SHA:AEAD

==========================================================

Generate the cert and private key:

openssl req -x509 -sha256 -nodes -days 3650 -newkey rsa:4096 -keyout serverkey.pem -out servercert.pem

==========================================================

Generate the DH parameters:

openssl dhparam -out dh2048.pem -outform PEM -2 2048

==========================================================

How to List Elliptic Curves:

openssl ecparam -list_curves

===========================================================

Note: Generating DH parameters is gonna take a while. If you are implementing this on a slowish machine like a Raspberry Pi, you might want to do the DH step on a faster machine, then copy the resulting file over.


Note to the mods: this comment breaks formatting on mobile, forcing the min page width to be much wider than usual and even more difficult to read on a phone.


This is generally helpful, but be wary of advice from people who refer to signature algorithms or cryptographic hash functions as ciphers.

It should also be mentioned how you came up with your ordering of TLS cipher suites. In declining priority:

  1. Forward security is preferred (ECDH|DH > RSA)
  2. AESGCM > AES256 > AES|AES128 > 3DES
  3. ECDH > DH


For ECC djb and Tanja Lange have put together a great list of how possible it was to tamper with each of the ECC primitives listed: http://safecurves.cr.yp.to/rigid.html


It's not just about compromised encryption algorithms, it's also about picking the right algorithm for a given purpose.

For instance, a hashing algorithm can be used to securely store passwords, and must therefore be slow, or to find duplicate files, a task which greatly benefits from speed. If you use a fast hashing algorithm to "securely" store passwords, you might as well use a compromised algorithm, since the security is nonexistent in both cases.
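A minimal illustration of that split, using only the Python standard library (the password and file path are placeholders; hashlib.scrypt needs a Python built against OpenSSL 1.1+):

  import hashlib, os

  # Fast hash: great for spotting duplicate files, useless for passwords.
  digest = hashlib.sha256(open("/tmp/somefile", "rb").read()).hexdigest()

  # Deliberately slow, salted KDF for password storage (scrypt):
  salt = os.urandom(16)
  stored = salt + hashlib.scrypt(b"hunter2", salt=salt, n=2**14, r=8, p=1)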

I think the same applies to crypto algorithms: it doesn't matter if the building blocks are individually secure if you don't know how to put them together in a secure fashion.


You might save yourself trouble by thinking of a "hash algorithm" as an infrequently-used primitive, and of a password hash (or KDF) as the thing you'd store a password authenticator with.


Nobody is arguing against that. The OP is asking about compromised algorithms. Sure, non-compromised algorithms can still be used incorrectly and be insecure, but compromised ones won't be secure no matter how they're used.

Also, you don't use hashes to store passwords, you use KDFs.


> Also, you don't use hashes to store passwords, you use KDFs.

Well, technically a hash can be seen as a particular key derivation function (KDF). Not a proper one for the purpose of storing passwords I agree, but then most KDFs are built using salt + an iteration of hash functions, to my knowledge at least (which I admit is not very deep on the subject).


A hash serves as the PRF in a KDF construction; it's a building block, not a subset.
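Concretely, with the Python standard library: PBKDF2 iterates HMAC-SHA256 over a salt many thousands of times. The hash is the PRF inside the construction; the KDF is the construction.

  import hashlib, os

  salt = os.urandom(16)
  key = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 600_000)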


SHA256, SHA3, AES, ECDSA and ECDH/ECIES are all good, plus one-time pads and Shamir's secret sharing. There's no real need to use anything else.


SHA256 is an instance of SHA2. Presumably you didn't mean that SHA512 was bad, or, better, SHA512/256.

ECDSA, ECDH, and ECIES (which we don't see a lot of) all require a curve. Saying "ECDH is good" isn't helpful if you can't safely choose a curve to run them over.

ECDSA has another problem: it has a hard randomness requirement. If you repeat the per-message nonce, leak bits of the message nonce, or even fail to fill the modulus for the message nonce, you set up a condition where attackers can recover your private key. DJB is trying to push ECDSA into disfavor, replacing it with deterministic signatures.
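The nonce-reuse failure can be shown with bare modular arithmetic (a toy: no real curve operations; n is secp256k1's group order, and d, k, and the z values are made up). In ECDSA, s = k^-1 * (z + r*d) mod n, and r depends only on k, so two signatures sharing k share r:

  n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
  d, k = 0xC0FFEE, 0x123456789        # toy private key and repeated nonce
  r = pow(k, 3, n)                    # stand-in for the x-coordinate of k*G

  def sign(z):                        # the ECDSA signing equation
      return (z + r * d) * pow(k, -1, n) % n

  (z1, s1), (z2, s2) = (0xAAAA, sign(0xAAAA)), (0xBBBB, sign(0xBBBB))

  k_rec = (z1 - z2) * pow((s1 - s2) % n, -1, n) % n
  d_rec = (s1 * k_rec - z1) * pow(r, -1, n) % n
  assert (k_rec, d_rec) == (k, d)     # private key recovered from two sigs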

One-time pads are awful and should be avoided at all costs. Virtually every computer program developed by generalist programmers that claimed to be a "one-time pad" was instead a crappy stream cipher.

Shamir splitting is fine, although that's a strange thing to have in your "regular use" bag.


> SHA256 is an instance of SHA2. Presumably you didn't mean that SHA512 was bad, or, better, SHA512/256.

Right. But I don't see a reason why you would ever want either of those when SHA256 works just fine. Maybe because it's a bit more efficient per byte as a PRNG, but there are better specialized tools for that.

> ECDSA has another problem: it has a hard randomness requirement.

Not if you use RFC6979. In the Bitcoin space that's been standard since about three days after the Java.SecureRandom bug. As for curves, secp256k1 and curve25519 seem to be the most popular as far as I can tell.

> One-time pads are awful and should be avoided at all costs. Virtually every computer program developed by generalist programmers that claimed to be a "one-time pad" was instead a crappy stream cipher.

Well, yes, an OTP generated by a PRNG is a stream cipher by definition. You do need true randomness for them to work. I think they're very useful if (1) you're scared that NSA has a constructive proof of P=NP deep inside their lairs, or (2) you want ultimate deniable encryption.
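For reference, what an actual OTP looks like (a toy using the standard library; the message is a placeholder). Swap os.urandom for any PRNG and, per the above, you've merely built a bad stream cipher:

  import os

  msg = b"attack at dawn"
  pad = os.urandom(len(msg))               # truly random, same length, used once
  ct = bytes(m ^ p for m, p in zip(msg, pad))
  assert bytes(c ^ p for c, p in zip(ct, pad)) == msg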


SHA512/256 is safer than SHA256 for more than one reason (immunity to length-extension attacks among them). secp256k1 is "popular" for cryptocoins.


All.

All encryption is breakable. You aren't choosing an unpickable lock; you are choosing how good a thief it will take to rob you.

A 4096-bit key might make it really expensive to attack you, but those old numbers about "it would take a computer 40,000 years to crack" don't matter much in a world where that just means you spin up 160k instances in the cloud for 3 months.

That's a dollar amount that makes cracking YOUR bank account not worth doing. But if it were the nuclear launch codes for Russia's arsenal, it would not be out of reach.


Your scale is way off.

To brute-force AES-128, if you assume:

- Every person on the planet owns 10 computers.

- There are 7 billion people on the planet.

- Each of these computers can test 1 billion key combinations per second.

- On average, you can crack the key after testing 50% of the possibilities.

Then the earth's population can crack one key in 77,000,000,000,000,000,000,000,000 years.

Source: Seagate, http://dator8.info/pdf/AES/3.pdf
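For what it's worth, plugging those exact assumptions into Python gives roughly 77 billion years rather than the larger figure quoted; either way, the conclusion (brute force on AES-128 is hopeless) stands:

  keys = 2**128
  rate = 7e9 * 10 * 1e9                     # people * computers each * keys/sec
  years = (keys / 2) / rate / (365.25 * 24 * 3600)
  print(f"{years:.1e} years")               # ~7.7e10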


Here is a better article.

http://www.eetimes.com/document.asp?doc_id=1279619

But they are both still wrong.

A: the rate of keys per second in both is way too low, and B: you don't have to test every combination; certain combinations will tell you that whole chunks of possibilities can be ruled out.

In truth, most of the time you can narrow the candidates to 1% of the total keyspace and determine a range for the right answer pretty quickly.

Granted, even if it were as slow as 77 billion years, 0.7 billion years is still a long time. But no, these numbers are off by orders and orders of magnitude.


160,000 EC2 instances running for years won't make a dent in a 2048 RSA key.


If you have a large enough table of primes and factors, RSA-2048 is only about 5% harder to crack than RSA-1024.

If you don't have one it is billions of times harder. How large is your Prime table? How large is mine? How large is the NSA's?


No.


I'm of the opinion that trusting any of them at this point could disappoint.


Can we really trust any of the algorithms from the NSA Suite B?


You mean, should we trust SHA2 and AES?



