
Good read! Does this new proof have an effect on cryptography given that it's often desirable to have highly sensitive cryptographic functions (you flip one bit in the input and get a very different output)?



Nah. Computing sensitivity has always been easy enough: flip bits and watch the output. This is about how to reason about it in a generalized way that fits in with related work in the field.


Sensitivity here is defined as:

Take all possible input strings

Given any input string, the sensitivity of that string is how many bits you could flip that cause the output to flip.

Now take the /maximum/ of all input string sensitivities.

For e.g. a hash function, you'd want either a minimum or something like a 1st percentile.
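For concreteness, a brute-force sketch in Python (my own toy code, only feasible for small n):

    from itertools import product

    def sensitivity_at(f, bits):
        # Count single-bit flips of this input that change f's output.
        out = f(bits)
        return sum(
            f(bits[:i] + (1 - bits[i],) + bits[i + 1:]) != out
            for i in range(len(bits))
        )

    def sensitivity(f, n):
        # The function's sensitivity is the max over all 2^n inputs;
        # for hash-like uses you'd look at the min (or a percentile) instead.
        return max(sensitivity_at(f, x) for x in product((0, 1), repeat=n))

    # Example: OR on 3 bits. The all-zeros input has sensitivity 3
    # (flipping any single bit turns the output from 0 to 1).
    print(sensitivity(lambda x: int(any(x)), 3))  # -> 3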


> For e.g. a hash function, you'd want either a minimum or something like a 1st percentile.

As far as I know, all cryptographic hash functions are sensitive to single bit-flips by design.
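Easy to check empirically; a quick Python sketch of the avalanche effect (flip one input bit, expect roughly half of SHA-256's 256 output bits to flip):

    import hashlib

    msg = bytearray(b"the quick brown fox")
    h0 = hashlib.sha256(bytes(msg)).digest()

    msg[0] ^= 0x01  # flip a single input bit
    h1 = hashlib.sha256(bytes(msg)).digest()

    # Count differing output bits; ~128 of 256 for a well-designed hash.
    print(sum(bin(a ^ b).count("1") for a, b in zip(h0, h1)))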


It is definitely a design goal, but I would be very surprised if SHA-2 did not have two inputs that differ by only a single bit yet have the same output.

Finding those inputs is essentially impossible, but for a true 'random oracle' they are likely to exist.
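Back-of-envelope for a random oracle with a 256-bit output: the 2^256 inputs of length 256 bits form 2^256 * 256 / 2 pairs that differ in exactly one bit, and each pair collides with probability 2^-256, so you'd already expect about 128 one-bit-apart collisions at that single input length alone.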


Note: correct me if I'm wrong, but to be relevant to this theorem and its definition of sensitivity, the two strings would have to be inputs to the same circuit and therefore the same length, which makes finding them much harder than finding hash collisions in general.

And I thought they had ways to construct hash functions so that all the inputs of the same length have a different output?


> And I thought they had ways to construct hash functions so that all the inputs of the same length have a different output?

That's trivially impossible for fixed-size hashes, by the pigeonhole principle.
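Concretely, taking SHA-256 as the example: there are 2^257 distinct inputs of 257 bits but only 2^256 possible 256-bit digests, so at least two of those inputs must share an output.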


Sorry, I meant all inputs whose length equals the output length.

The point being, for at least that case you can guarantee that flipping even a single bit changes the output.
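As a toy illustration (not a cryptographic construction): any bijection on fixed-length strings has this property, since distinct inputs can never share an output. For example, in Python:

    # Toy bijection on 64-bit ints: multiplying by an odd constant
    # mod 2^64 is invertible, so distinct inputs give distinct outputs.
    ODD = 0x9E3779B97F4A7C15  # arbitrary odd constant, my choice
    f = lambda x: (x * ODD) % 2**64

    x = 0xDEADBEEF
    # Flipping any single bit of x changes f(x).
    assert all(f(x) != f(x ^ (1 << i)) for i in range(64))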


Not if the output is longer than the input, such as for passwords.


I doubt any study of hash functions restricts itself to a tiny subset of the (infinitely many) possible inputs. The security of hash functions is always studied under the assumption of arbitrary inputs.


Yes, but sensitivity is the wrong measure to prove that. It's the difference between big-O and small-o complexity.


"Given any input string, the sensitivity of that string is how many bits you could flip that cause the output to flip."

The sensitivity of that string is the minimum number of bits that need to be changed in order to change the output?

Not being snarky - just want to see if I understand. The phrase "how many bits you could flip" is ambiguous.


The number of single-bit flips that would change the output on their own.

From the article: "If, say, there are seven different lies you could have told that would have each separately flipped the outcome, then for your loan profile, the sensitivity of the Boolean function is seven."


Great question. If the answer is yes, it follows that state cryptographers may even have solved this long ago but can't disclose it.

With all the leaks lately from inside those places, it would be nice if some basic math results could make it outside.



