OpenSSL's implementation of DSA appears to be broken (ludost.net)
119 points by e12e on Sept 5, 2015 | 25 comments



The linked discussion between Guninski and Peter Gutmann on the Cypherpunks list is much more interesting.

Since there's probably no implementation of any crypto standard that has ever gotten parameter validation right, calling it a "backdoor" is a bit of a stretch.


I think there is an argument to be made that the ongoing requirement for "flexible" crypto represents a backdoor. We know enough now to say it will never be done right. Even in the 90s, when we perhaps lacked that hard-earned experience, I think we knew enough to say it wasn't proven secure, and that the safe choice was therefore less flexibility.

No, it's not specifying a particular backdoor, but if you make something hard enough to get right, you can be assured one will develop organically. And even if it's not deliberate malfeasance, can we at least call it callous negligence?


TLS is a pile of IETF design-by-committee extensions from a group uncomfortable with saying "no"; OpenSSL is a mountainous kitchen sink of TLS and other features. OpenSSL seems to "work," but the endless pattern of vulnerabilities lies in the risky process by which code is developed, lightly tested, and pushed out as a production release... That hasn't changed enough to convince me they're going to get proactive and stop the SNAFUBAR anytime soon.

'cperciva was on the right track: a better protocol would encode parameters much more simply and have just enough options to cover the important gamut of use-cases. The issue is we're seemingly stuck on TLS everywhere, and switching would require massive cost and coordination pain, even if the replacement were 10X better and formally proven.


It would be fairly cheap if we got our secure-connection APIs right. Upgrading the protocol should be as simple as upgrading any system library.
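
To make that concrete, here is a hypothetical sketch (the names are invented, not any real library) of what such an API could look like in C. The point is that the caller never names a protocol version or cipher, so a protocol upgrade is invisible to applications:

    /* Hypothetical sketch only: none of these names exist in a real
     * library.  The caller never mentions a protocol or cipher, so
     * upgrading the protocol is just a library upgrade. */
    #include <stddef.h>

    typedef struct secure_conn secure_conn;   /* opaque handle */

    /* Connect to host:port and authenticate the peer against the
     * system trust store; every protocol/cipher decision is internal. */
    secure_conn *secure_connect(const char *host, unsigned short port);

    long secure_read(secure_conn *c, void *buf, size_t len);
    long secure_write(secure_conn *c, const void *buf, size_t len);
    void secure_close(secure_conn *c);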


> The linked discussion between Guninski and Peter Gutmann on the Cypherpunks list is much more interesting.

Yes, these two messages in particular are worth reading:

https://cpunks.org/pipermail/cypherpunks/2015-September/0090...

https://cpunks.org/pipermail/cypherpunks/2015-September/0090...


OpenSSL allows (by virtue of doing nothing to check for) composite p and/or q for DSA keys. Is there something here beyond "if you use parameters from an untrusted source you might have a bad time"?
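
For reference, the kind of check OpenSSL is not doing looks roughly like this against the 1.0.x BIGNUM API (a sketch, not OpenSSL's code; 1.0.x still exposes the DSA struct fields directly):

    #include <openssl/bn.h>
    #include <openssl/dsa.h>

    /* Sketch of the missing sanity check, for OpenSSL 1.0.x where the
     * DSA struct fields are still public.  Returns 1 if the domain
     * parameters look sane, 0 otherwise.  A fuller check would also
     * validate g (1 < g < p and g^q mod p == 1). */
    static int dsa_params_look_sane(const DSA *dsa)
    {
        int ok = 0;
        BN_CTX *ctx = BN_CTX_new();
        BIGNUM *r = BN_new(), *pm1 = BN_new();
        if (ctx == NULL || r == NULL || pm1 == NULL)
            goto out;

        /* p and q must both be (probable) primes... */
        if (BN_is_prime_ex(dsa->p, BN_prime_checks, ctx, NULL) != 1 ||
            BN_is_prime_ex(dsa->q, BN_prime_checks, ctx, NULL) != 1)
            goto out;

        /* ...and q must divide p - 1, i.e. (p - 1) mod q == 0. */
        if (BN_copy(pm1, dsa->p) == NULL || !BN_sub_word(pm1, 1))
            goto out;
        if (!BN_mod(r, pm1, dsa->q, ctx) || !BN_is_zero(r))
            goto out;

        ok = 1;
    out:
        BN_free(r);
        BN_free(pm1);
        BN_CTX_free(ctx);
        return ok;
    }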


I don't think so.


I've discovered similar issues in TLS implementations, but I don't think this is actually a vulnerability without more details. Just because a client will use bad parameters doesn't mean that it will use its existing key with those bad parameters. It shouldn't be hard to make a PoC if this is in fact what is going on.


I don't completely understand the description, but I think they managed to extract the private key of the server.


Why don't we start implementing this kind of software in ways that can be formally verified? Sounds crazy hard, but it's not that difficult.

For example, the Airbus fly-by-wire system was proven free of runtime errors using abstract interpretation. And it's coded in a subset of C. [1]

One can formulate other analyses to look for e.g. leaked information.

I think 70s tech doesn't cut it when dealing with formidable adversaries like the NSA.

[1] http://www.astree.ens.fr/


Proving formal properties requires formal specifications.

“Absence of undefined behavior”, the property that Astrée more or less verifies, is one piece of specification that authors of formal tools for C might appear to get for free. In reality, they have to formalize it themselves, because the C standards are written in English, contain much ambiguity, and their interpretation evolves over time. Two decades ago, signed arithmetic overflow was described as undefined because everyone “knew” that the authors of the standard only wanted to accommodate one's-complement and sign-magnitude machines, and therefore everyone assumed that if you knew your architecture was two's complement, you could expect two's-complement behavior.
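
The canonical example of that shift, for anyone who hasn't watched a compiler exploit it:

    #include <limits.h>
    #include <stdio.h>

    /* On a two's-complement machine, INT_MAX + 1 "obviously" wraps to
     * INT_MIN, so this looks like it must return 0 for x == INT_MAX.
     * But signed overflow is undefined behavior, so a modern compiler
     * is free to fold the comparison to 1. */
    int wraps(int x)
    {
        return x + 1 > x;
    }

    int main(void)
    {
        printf("%d\n", wraps(INT_MAX));  /* at -O2, typically prints 1 */
        return 0;
    }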

The issue being discussed here is not one of having undefined behavior, though. OpenSSL is being blamed for not satisfying a property that it wasn't even documented as having, much less formally specified.

Your comparison to aeronautics code also omits the price of developing to aeronautics safety standards. For one thing, in aeronautics it is decided in advance what the software will do, and the software is implemented to have only those features. That is not how OpenSSL got to where it is today!

Astrée is only used to verify one aspect of the safety of part of the code in the aircraft, so it is really not a good example to use here. If you wanted to make a strong case, you could use miTLS: http://www.mitls.org/wsgi/home

And since miTLS already exists, you can ask why it is not used in place of OpenSSL. Part of the answer is that it is written in F# and is only likely to interface easily with .NET projects.

My employer sells formal verification reports for open-source software. One piece of software we have formally verified with a tool comparable to Astrée is PolarSSL: http://trust-in-soft.com/polarssl-verification-kit/

This verification did not check whether the algebraic properties of any key are correctly validated in any of the cases where this would be relevant, because that's outside the perimeter. The verification sets out to verify that the code doesn't have any of the problems of the sort that Astrée would find (and a bit more), and it does exactly that.
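
For readers wondering what "formally specified" looks like in practice, here is a toy contract (not taken from the PolarSSL report) written in ACSL, the annotation language Frama-C uses; verification of this kind is done against annotations of exactly this shape:

    #include <stddef.h>

    /*@ requires \valid_read(a + (0 .. n-1));
      @ assigns \nothing;
      @ ensures \result == 0 || \result == 1;
      @*/
    int all_zero(const unsigned char *a, size_t n)
    {
        size_t i;
        /*@ loop invariant 0 <= i <= n;
          @ loop assigns i;
          @*/
        for (i = 0; i < n; i++)
            if (a[i] != 0)
                return 0;
        return 1;
    }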


The work you have been doing on PolarSSL is pretty inspiring. I use it as a success story for formal methods and PL technology when I do "science of security" classes or briefings.

Someday, I would love to do a research project on using tools like yours to prove higher level properties of programs, much like what the Ironclad/Dafny teams have done. Their approach is good when you have a formal spec and can write from scratch, your approach is good when you already have the code and can create a spec post-hoc, I think. Have you followed their work?


I have followed the Dafny language from a distance since the beginning. The Ironclad project I am just discovering, but it is everything I would expect from Microsoft: a willingness to give formal methods a chance at real use and to get valuable feedback.

The sad truth of from-scratch formal methods is that only Microsoft-like companies can afford to give them a try. And so far, of the Microsoft-like companies, only Microsoft is doing so.

Regarding the verification of PolarSSL, you may find it useful to see what the entire report looks like concretely, even if it is for a now-obsolete version of PolarSSL: http://trust-in-soft.com/polarSSL_demo.pdf

We have improved the methodology somewhat since that report was made, but it does give a prospective customer an idea of what they can expect for a recent branch of PolarSSL and for a usage of the library that we have agreed on before elaborating the report. TrustInSoft is going to announce the availability of this demo report this week; consider this a sneak preview.


Yes, these reports are nice. I've seen this kind of thing before, and I've worked with Frama-C and other tools personally.

I think that there's some interest in from-scratch methods growing in government circles now that there are some empirical results, and that the consequences of rampant insecurity are starting to be broadly observed. We'll see if we can collectively get out in front of it though...


Formal methods have their place.

I think we do need more waterfall-type design and documentation of cryptosystems (not massive pointless documentation, but relevant documentation and design justifications for security/engineering tradeoffs).

Also, for more and better eyeballs to audit the code, it needs to be simple and clear.

More thorough C unit/system testing, fuzzing, and static/dynamic analysis steps as standard would help... I think a project/startup that acted like a Travis CI for the security and correctness of C code would work, because there's just so much C code that needs better coverage.
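
To make the fuzzing step concrete, a libFuzzer harness is about this much code (parse_dsa_params is a stand-in for whatever parser you want covered; the entry point below is libFuzzer's real interface):

    #include <stddef.h>
    #include <stdint.h>

    int parse_dsa_params(const uint8_t *buf, size_t len); /* code under test */

    /* Build with: clang -g -fsanitize=fuzzer,address harness.c parser.c
     * libFuzzer then feeds attacker-controlled bytes into the parser;
     * ASan turns memory errors into immediate crashes. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        (void)parse_dsa_params(data, size);
        return 0; /* always return 0, by libFuzzer convention */
    }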


This would help how in this case?



Can someone explain the vulnerability this post is talking about in simpler terms?


I feel like if an implementation (or RFC) had a spelling error in it, people would be screaming "backdoor." Granted, this spec is from 1996? But I think the fact that nobody ever used this speaks for itself; why backdoor a standard nobody would ever use?


Why would anyone use DSA in this day and age anyway?

EdDSA is where it's at.


People use DSA for the same reasons they use RC4, 3DES, and MD5: that's what the other party's 10 year old legacy platform that will never be updated supports.


Then those other parties get hacked to hell and put out statements such as "We care deeply about our customers' security!"


It's a hard call for service providers; sort of a catch-22. They have to support old ciphers because many of their potential customers run old software or operating systems that don't support newer ciphers, so to make money they're effectively forced into a less secure position. They can care deeply about customer security while still being forced to offer only the most secure options their customers will support, and to accept that ugly compromise as a necessity of staying in business.
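
The knob providers actually turn here is one line of configuration; dropping the legacy algorithms is trivial, and the hard part is the customers it locks out. With OpenSSL, for instance, something like:

    #include <openssl/ssl.h>

    /* Refuse RC4, 3DES, MD5 and anonymous suites; keep only "HIGH"
     * strength suites.  Returns 1 on success, 0 if no cipher survives
     * the filter (OpenSSL 1.0.x cipher-string syntax). */
    int harden_ctx(SSL_CTX *ctx)
    {
        return SSL_CTX_set_cipher_list(ctx, "HIGH:!aNULL:!RC4:!3DES:!MD5");
    }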


Setting up a proxy for those customers is possible. Different endpoints for those customers are possible. A reverse proxy is not such a hard thing to set up these days.

But really it's about everyone just being human, and therefore crap at these things. Once security 'products' realise this, and cope well with people being rubbish at their jobs, then those products will make a lot of money... If only the people measuring ROI were also not rubbish at their jobs... ;)


Only if you care about performance. But not everybody is happy with the chosen magic curve parameters, which could conceal backdoors.



