Researchers identify first flaws in AES (net-security.org)
155 points by fogus on Aug 17, 2011 | 35 comments



I guess once SHA-3 is done, NIST had better start research on the next symmetric cipher. The attack only knocks off two bits now, but attacks only get better with time.

Note that you should be using accepted standards for key management anyway. NIST believes that AES-128 has 128 bits of strength, and AES-256 has 256 bits of strength [1, page 63], which they find "Acceptable" until 2031 [1, page 65]; the catch is that any individual key should be expired after a few years. This is called a "cryptoperiod," and the longest they recommend using a key is five years [1, page 55].

By this standard, keep using AES-256, but change your keys every year, and be prepared to migrate to a new cipher at some point within the next decade or two.

[1]: http://csrc.nist.gov/publications/drafts/800-57/Draft_SP800-...


The total cost to the industry of adopting AES was enormous. That implies that deprecating it would at present impose costs that far outweigh the benefits, because replacing AES will be expensive, and there are at present minimal practical benefits to doing so. Remember, plenty of critical systems have been fielded that rely on RC4 and DES-EDE, both of which are also (for all intents and purposes) unbroken in their most common usage.

For Hacker News, the far more important thing to point out is that no matter what block cipher you use, you are never going to get burnt by research like this, but you are almost certain to be completely boned by implementation errors.


I would consider key management to be a greater problem than implementation, at this point. There are many verified, open-source, freely-licensed cryptography libraries out there. You shouldn't be rolling your own implementation.

Unfortunately, key management is still a difficult problem. Without secure keys, even the most secure encryption becomes nothing more than an inconvenience.


Everyone says this, because Applied Cryptography (not a good book) says it. But no, the problem of coming up with a good 128 bit or 256 bit key is not the hardest part of getting a cryptosystem working. Plenty of static-keyed systems, in which keys are more or less guaranteed to come from /dev/random, end up totally busted.

You seem to be thinking that when I say "implement", I mean, "building an AES core library from scratch". No. I mean properly using a well tested, totally sound AES library in a way that doesn't leave your application totally boned.


I've seen a lot of broken key management. People using MD5(passphrase) (cough OpenSSH cough) as a key, people downloading public keys over insecure networks and not verifying them at all, people distributing secret keys along with software destined to be run on hostile systems...

I'm happy to say that systems which try to generate random passwords do tend to generate securely random passwords, though. /dev/random has been a huge help there.
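A minimal sketch of the fix for the MD5(passphrase) pattern above, using Python's standard hashlib. The salt handling and iteration count here are illustrative assumptions, not a vetted recommendation:

```python
# Deriving a key from a passphrase with a slow, salted KDF instead of a
# bare MD5(passphrase). Parameters below are illustrative, not a tuned
# recommendation.
import hashlib
import os

passphrase = b"correct horse battery staple"
salt = os.urandom(16)  # random per-key salt; store it alongside the ciphertext

# Weak: a single unsalted hash, trivially brute-forced offline.
weak_key = hashlib.md5(passphrase).digest()  # 16 bytes

# Better: a deliberately slow, salted key-derivation function.
strong_key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100_000, dklen=32)

print(len(weak_key), len(strong_key))  # 16 32
```

The point is not the specific KDF; it is that the derivation is salted and deliberately expensive, so an attacker can't cheaply grind through a passphrase dictionary offline.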


You wouldn't want to leave anyone with the impression that they're mostly fine as long as their keys are securely generated.


No, of course not. That's why I work so hard to point out the ways that people get crypto wrong (and how to do it right).

I'm just saying that key management is absolutely not a solved problem yet.


Yeah yeah, not arguing, just wanted us both on the record on this point. Crypto: dangerous.


>people downloading public keys over insecure networks and not verifying them at all

And note that SSL downloads public keys over insecure networks. Verifying them is why certificates were created.


Come on, "coming up with a good 128 bit or 256 bit key" is only a very small part of key management (and, as you note, probably also the easiest part of key management).

You know this.


You misunderstood me.

What I did was come up with the simplest, most reliable key management system there is (static keying with secure keys), and presumed it to be safe.

Then I pointed out that such an application is still likely to be boned by implementation flaws that have nothing to do with how the keys are established.


> You seem to be thinking that when I say "implement", I mean, "building an AES core library from scratch". No. I mean properly using a well tested, totally sound AES library in a way that doesn't leave your application totally boned.

Can you please fix the images on your blog post about that :)


For Hacker News, the far more important thing to point out is that no matter what block cipher you use, you are never going to get burnt by research like this, but you are almost certain to be completely boned by implementation errors.

Wish I could up vote you twice for this.


The total cost to the industry of adopting AES was enormous

Part of that was the cost of getting rid of years of cruft, though. Dropping in a new block cipher which operates on 128-bit blocks using 128-, 192-, and 256-bit keys would probably be much cheaper.


One should also note that even under the fanciful hypothesis of a major breakthrough costing AES-128 two bits of strength once a year for ten years, we would probably still have decades to replace the cipher. Of course this theory is full of baseless hypotheses, but even more baseless is the theory that says there is a risk of a major breakthrough tomorrow reducing the strength of AES-128 to 64 bits.

There is no need to hurry.


Speaking of SHA-3, "biclique cryptanalysis was first introduced for hash cryptanalysis". Here's an interesting quote from the paper regarding the relationship between attacks on block ciphers and hash functions:

"... the block cipher standard AES is almost as secure as it was 10 years ago in the strongest and most practical model with a single unknown key. The former standard DES has not seen a major improvement since Matsui’s seminal paper in 1993.

In contrast, the area of hash function cryptanalysis is growing quickly, encouraged by the cryptanalysis of MD5, of SHA-0 and SHA-1, followed by a practical attack on protocols using MD5, preimage attacks on Tiger and MD5, etc. As differential cryptanalysis, a technique originally developed for ciphers, was carried over to hash function analysis, cryptanalysts are now looking for the opposite: a hash analysis method that would give new results on block ciphers."


Note: This attack requires a large amount of RAM. They may have saved a few operations, but they increased the total cost by many orders of magnitude over a brute-force search because they need a larger machine.

Also note: An attack taking 2^126.1 operations to break AES-128 is not 2^1.9 times faster than brute force; it's 2^0.9 times faster than brute force. A brute force search on a 128-bit key takes on average 2^127 operations.
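The arithmetic above can be checked in a couple of lines of Python (2^126.1 is the complexity claimed in the paper):

```python
import math

attack_ops = 2 ** 126.1       # claimed biclique attack complexity
brute_force_avg = 2 ** 127    # average case: half the 2^128 keyspace

speedup = brute_force_avg / attack_ops
print(round(math.log2(speedup), 1))  # 0.9 bits saved
print(round(speedup, 2))             # 1.87, i.e. under 2x faster on average
```

Comparing 2^126.1 against the full 2^128 keyspace would give the "2 bits" headline figure; against the average-case 2^127 it is less than a factor of 2.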

In short, this is definitely in the "interesting new method to explore" column rather than the "interesting attack" column.


Also note: An attack taking 2^126.1 operations to break AES-128 is not 2^1.9 times faster than brute force; it's 2^0.9 times faster than brute force. A brute force search on a 128-bit key takes on average 2^127 operations.

I'm confused (though admittedly I haven't read the paper fully and don't have any background in cryptanalysis, so I guess it's not surprising that I'm confused). It looks like 2^126.1 is a worst-case complexity for this attack, so it makes sense to compare that complexity to the worst-case complexity of a brute force attack, doesn't it?


My understanding was that it was the average complexity for the attack. It's possible that I misunderstood it, though -- I didn't read the paper in detail.


"Note: This attack requires a large amount of RAM. They may have saved a few operations, but they increased the total cost by many orders of magnitude over a brute-force search because they need a larger machine."

Is this terribly relevant though? Deep Crack was a quarter of a million dollars (90s dollars) as I recall.


Yes. The entire concept of cryptanalysis revolves around cost factors. It is very relevant when a new avenue of attack costs more than brute force in its best known mode of implementation.

That doesn't make the research a dead end, but it further reduces the relevance of this work to engineers.


It depends on whether "a large amount of RAM" means "a quarter of a million dollars' worth of RAM" or "an amount of RAM substantially larger than the Milky Way galaxy".


I don't know the details of the attack, but the article says, "AES-128 is more like AES-126", so you should need 2^125 operations on average, which is 4 times faster.

On the other hand, I may be wrong... do you mind linking to the original source?


dchest linked below. Their attack on AES-128 is claimed to be 2^126.1 operations.



Ah!

Novel cryptanalysis technique reduces practical AES key length by about 2 bits.


Which is quite an achievement.


Does anybody remember or have a citation for the story about how the NSA made some recommended changes to Rijndael, but nobody understood them at the time? This was back in the mid-90s, during the AES/DES replacement assessment and competition period.

It was then only years later that the NSA changes were understood to be strengthening the cipher against an attack type that was unknown to the public at the time, but that the NSA obviously knew about. I remember there were also suspicions that the NSA were attempting to weaken the algorithm, much as they did with DES.

It was the case that the NSA were always 10-15 years ahead of the public in crypto (or even longer in terms of the discovery of public key crypto). I wonder if their old proposed changes to Rijndael have anything to do with this, or if they knew about this type of attack but still recommended AES.

Apologies for being vague about the story - but I do remember those details. I can't recall where I heard about it.


You're misremembering, twice. There was a story about mysterious S-boxes selected by NSA for DES http://en.wikipedia.org/wiki/Data_Encryption_Standard#NSA.27.... Those S-boxes turned out to be designed to resist differential attacks, which were only known to NSA at that time. This doesn't count as weakening at all.

The difference between AES and Rijndael is only in that AES has a fixed block size of 128 bits and a key size of 128, 192, or 256 bits, whereas Rijndael can be specified with block and key sizes in any multiple of 32 bits, with a minimum of 128 bits and a maximum of 256 bits. (http://en.wikipedia.org/wiki/Advanced_Encryption_Standard#De...)


Well, we certainly won't have to worry about it for now. But that doesn't mean we shouldn't worry about it, of course.

It's many orders of magnitude beyond what our computers can do now, and I don't think computing power will keep growing at that rate unless new breakthroughs are found.


The general rule of thumb for cryptanalysis attacks is that they become cheaper over time (usually quicker than initially proposed), which is why finding issues (even mathematical issues that aren't yet practical) is so important.

Considering the length of time that it takes for new encryption standards to be proposed and accepted, it's important to try and stay ahead of the curve.


If you are using AES directly in your code, you should be very, very worried no matter what the research says, because you are vanishingly unlikely to have gotten it right.

Use TLS or GPG.


If you couldn't crack it before, you still can't crack it now. This attack only makes it 4 times easier to break, not even an order of magnitude.


If you count in binary, 2 bits are two orders of magnitude.


Don't panic yet. It still requires a trillion machines for billions of years to brute force.
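A rough sanity check of that estimate, under assumed (hypothetical) hardware of 10^12 machines each testing 10^9 keys per second:

```python
ops = 2 ** 126.1          # attack complexity claimed in the paper
machines = 1e12           # hypothetical: a trillion machines
keys_per_second = 1e9     # hypothetical: a billion keys/sec per machine

seconds = ops / (machines * keys_per_second)
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.1e} years")  # on the order of billions of years
```

So even granting absurdly generous hardware, the attack's 2^126.1 work factor keeps the wall-clock time in the billions of years.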



