Hacker News new | past | comments | ask | show | jobs | submit login

Interesting that the point of SHA-3 seems to have been missed: the finalists offer no fundamentally better way to hash, the main difference being that some are faster and some slower than the best SHA-2 variants.

What does this mean? In effect, no extra value is being directly offered. Sure, some have extra abilities by design, like being better able to leverage parallel processing by splitting the data to be hashed into chunks, working on partial blocks, and combining the results to get the final hash. That is nice.

But when it comes to brute forcing, being faster works against you. Also, the ability to work on partial chunks of the data lets you modify the code and recheck just the partial hash for the part you are changing until you get the same result. This allows you to do nasty things to code and still get the official hash answer far more easily than having to rehash the entire result after every change (the usual trick is to have an area you jump over, all NOPs, and modify that to influence the hash; there are saner ways to do this, but that's off topic).

So in essence, any hash that can be run faster in any way is weaker against brute forcing. (Yes, I know people assume their password will be the last one on the list to be checked by brute force, and that if it takes 10 years to test all variations then their password is 10 years strong; you can see the flaw in that mentality.)

Now NIST still has an opportunity here, and it is a simple, tried and tested approach: declare all the finalists winners and include them all in the standard as variations. This would let end users/admins pick their variation of choice, or even, perish the thought, allow mixed usage, so that your /etc/passwd file could have some users hashed with one variation and others with another. While it adds no obvious extra benefit, it allows more variations, and with that fallbacks and choice, and that is what n-bit encryption/hashing is all about: each bit being a choice, in a way.

So in summary, I believe NIST should let them all win and have SHA-3.n, with n being the finalist variation. Choice is good, and that is what n-bit encryption is after all: extra choices.




Ugh. Being faster does not work against secure hash functions! Holding all else equal, faster is invariably better.

What you're thinking of are password hashes, which are a variant/application of key derivation functions (KDFs). KDFs often use secure hash functions, which is where the confusion comes from.

You want your core crypto to be as fast as it conceivably can be, because you want to be making progress towards a state where all communications are encrypted by default.
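To make the KDF/hash distinction concrete, here is a minimal sketch using Python's standard hashlib; the password, salt, and iteration count are arbitrary illustrative values, not recommendations. The underlying SHA-256 is as fast as possible; the deliberate slowness comes entirely from the iteration count layered on top:

```python
import hashlib

# A KDF built on a fast secure hash: PBKDF2 with HMAC-SHA-256.
# The hash itself should be fast; the work factor is the iteration
# count (100_000 here is an arbitrary illustrative value).
password = b"correct horse battery staple"
salt = b"per-user-random-salt"  # would be random per user in real use
key = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
print(len(key), key.hex()[:16])
```

The same fast hash thus serves both roles: raw speed for integrity and signatures, and tunable slowness when wrapped in a KDF.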


In some instances, yes, faster is better. Take AES: while it was initially slower, now that it is implemented in hardware it is a lot faster.

The point I was making is that speed is a consideration, and the general mentality is that the larger the version number, the better it is; the original article's point was that none of the finalists is any better than what is already on offer with regard to security. It is the aspect of being able to take a large hashed file, modify part of it, and recheck just that partial hash without having to rehash the whole thing. For communications, this starts to open up a faster way to modify encrypted traffic: by changing a small part you only have to rehash that part and know the final block is still OK. This is an area where any hash function designed to work with partial blocks is, by design, less secure.

So fast is good, but it often comes as a compromise against security, and any new standard should at least be better than what it is designed to replace and not open up whole new avenues of attack.


> the original article's point was that none of the finalists is any better than what is already on offer with regard to security

In this case, possibly. It is quite clear by now that SHA-1 and MD5 are flawed, so the 'higher version' SHA-2 variants (especially the bigger ones) should be preferred.

> So fast is good, but it often comes as a compromise against security, and any new standard should at least be better than what it is designed to replace and not open up whole new avenues of attack.

Brute force attacks against 512-bit hashes are not practical today, and won't be practical for a long time. The concern with password storage is seldom pure brute force, but rather dictionary attacks: for password storage, the input entropy is typically much less than 512 bits (or even 128 bits). It's a completely different use case.
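The dictionary attack described above can be sketched in a few lines; the word list and the use of unsalted SHA-256 are purely illustrative:

```python
import hashlib

# Why low-entropy passwords fail regardless of digest size: the
# attacker searches the dictionary, not the 2^256 output space.
dictionary = ["password", "letmein", "hunter2", "123456"]
stored = hashlib.sha256(b"hunter2").hexdigest()  # unsalted, for illustration

cracked = next((p for p in dictionary
                if hashlib.sha256(p.encode()).hexdigest() == stored), None)
print(cracked)  # prints hunter2: four guesses, not 2^256
```

The digest width never enters the attacker's cost; only the size of the guess space does, which is why salting and slow KDFs matter for passwords while digest size matters for preimage resistance.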

> It is the aspect of being able to take a large hashed file, modify part of it, and recheck just that partial hash without having to rehash the whole thing.

Is this an argument against hash trees? Can you explain more about this potential attack? It seems to me to be as hard as finding preimages.


Brute force attacks against 512 bits (or 256 bits or 128 bits) aren't practical period.


> Is this an argument against hash trees? Can you explain more about this potential attack? It seems to me to be as hard as finding preimages.

If you only have to rehash a branch, as opposed to the entire tree, and match hashes, then you have an easier time, as it is a lot faster by design.

Now, if the hashing worked such that only, say, 1234567 could ever produce the hash value 11 and no other input could, then I would have no issue and would welcome it as a great achievement and pure brilliance. But I don't believe that is the case, nor could it be: reducing a large amount of entropy into a shorter digest is what a hash function does, after all, and one hash value will match more inputs than just the original data.
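For concreteness, the branch-versus-whole rehash cost difference looks like this in a minimal two-level sketch (illustrative only, not any finalist's actual tree mode; chunk names are made up):

```python
import hashlib

# Two-level hash tree: leaves hash fixed chunks, the root hashes the
# concatenated leaf digests.
def leaf_hashes(chunks):
    return [hashlib.sha256(c).digest() for c in chunks]

def root_hash(leaves):
    return hashlib.sha256(b"".join(leaves)).hexdigest()

chunks = [b"chunk-0", b"chunk-1", b"chunk-2", b"chunk-3"]
leaves = leaf_hashes(chunks)
before = root_hash(leaves)

# Changing one chunk needs only 2 hash calls (its leaf + the root)
# instead of 5, but any change still alters the root: a forgery would
# still require a preimage on the modified leaf.
leaves[2] = hashlib.sha256(b"chunk-2-tampered").digest()
after = root_hash(leaves)
print(before != after)
```

The speedup applies equally to the legitimate verifier and the attacker; what it does not do is make finding a colliding replacement chunk any easier, which is the point the replies below make.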


Actually, if you pick the blocks to parallelise in a striped fashion (i.e. with 100 bytes and a stride of 10, block 1 is every 10th byte, so bytes 1, 11, 21, 31, ..., and block 2 is bytes 2, 12, 22, 32, ...), then modifying the code/data for any nefarious purpose would be as hard (if not harder) than having to rehash the entire lot.

The only issue with this approach to blocking is that it needs all the data up front, and as such would, for example, be no use for streaming, which is a terrible example, but you get the idea.
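The striped split described above can be sketched as follows (sizes taken from the example; this is an illustration, not any standardized mode):

```python
# With 100 bytes and a stride of 10, block 0 takes bytes 0, 10, 20, ...
# and block 1 takes bytes 1, 11, 21, ...  A contiguous edit spanning
# 10+ bytes then touches every block, so there is no cheap
# partial-rehash shortcut, at the cost of needing all data up front.
data = bytes(range(100))
stride = 10
blocks = [data[i::stride] for i in range(stride)]
print(len(blocks), list(blocks[0]))
```

Note the trade-off the comment concedes: because every block spans the whole input, nothing can be hashed until the last byte arrives.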


Pulling off a preimage attack on a partial block is as hard as pulling off a preimage attack on the whole thing - you still need to find a second input which hashes to the same output as the original.


Can you recommend a concise taxonomy of crypto that spells out these kinds of distinctions clearly? I think it would be particularly interesting to see these "X uses Y" (i.e. KDF uses a secure hash) relationships spelled out, to better understand the "primitives" of crypto and how more specialized functions are built on top of them.

I would never have thought of a password hash as a KDF, because you don't use the key for anything except to compare equality. I also wouldn't have thought that an important property of a KDF is for it to be slow. In the case that you're using a KDF to compute an actual encryption/decryption key, this property does not seem important, because the output (the key) is just as sensitive as the input (the password).


Password hashes and key derivation functions can be totally different - key derivation functions only need to be slow if they're intended for low-entropy input, while password hashes in no way need to maximize entropy (e.g. "bcrypt, then 128 zeroes" is a perfectly fine password hash, but I wouldn't want to use the result as e.g. an AES key.)

In practice, though, it's desirable for password hashes to maximize entropy, which makes them usable as key derivation functions; and the key derivation functions that you usually need take passwords as input.


Not all KDFs need to be slow; only those that take a low-entropy secret (e.g., a password) as input do.


> So in essence, any hash that can be run faster in any way is weaker against brute forcing

Hash functions have many, many uses beyond password storage. For most of those uses they become more, not less, useful with speed. Fast hash functions are good for everybody. If you explicitly want a slow, expensive construct, then choose one properly designed to be slow and expensive. Don't just choose a crappy hash function.

> So in summary, I believe NIST should let them all win and have SHA-3.n, with n being the finalist variation. Choice is good, and that is what n-bit encryption is after all: extra choices.

I completely disagree. An organization like NIST has a responsibility to have an opinion on what the 'best' hash function is. They then need to track the state of the art of research that might invalidate that decision, and clearly communicate changes in the decision. While there is a defense-in-depth argument to be made for multiple options, the pattern seems to have been that there are a lot more systems broken because they chose a poorly studied or known bad algorithm than by breaks being found in previously known-good algorithms. We have a lot to lose from everybody making it up as they go along.


If NIST picks just one, and research later finds an issue in that one solution, then the whole standard is dust and SHA-3 is no more. If they pick more than one (all of them, as I said, for example), then if one of those variations is later found to be flawed, that subset can be dust while the SHA-3 standard and its implementations carry on, without the problem of suddenly having nothing to fall back upon. Sure, there are other standards that can currently be fallen back upon, but if some kit supports only SHA-3, then a standard with variations can only be a better thing than one without. These finalists have already been tested a lot, more so than their predecessors, so it is not a making-it-up-as-you-go approach at all; they had to pass a lot of standards/hurdles to get this far. What I'm saying is that if they all pass all the tests, then picking a winner comes down to other nuances, and the ability to have more than one winner adds robustness to the standard: a flaw found later in any single solution would negate only a subsection of the standard rather than the whole of it, hence the SHA-3.n.

That all said, if everybody agreed on everything, then life as we know it would be boring and we would all be fighting over the same woman at some stage, which would not work out well.


"That all said, if everybody agreed on everything, then life as we know it would be boring and we would all be fighting over the same woman at some stage, which would not work out well."

This would be especially awkward, since apparently she would also be fighting over herself and presumably would just elope with herself.


You do not want standards that have good parts and bad parts. If SHA3 blows up, then they pick something else and call it SHA4. And then everybody can look at the label on the box and know whether it's good or bad without having to read the entire list of ingredients.


> But when it comes to brute forcing, being faster works against you

We are probably still talking less than an order of magnitude, so that slowness isn't going to save the day in theory. It might in practice, but if it comes that close, the implementation will be deemed broken and something else will be advocated.

However, a slow function means a lot of cumulative power and time wasted in the years to come executing this new hash function. So I'd opt for a faster one.



