> How is missing a critical security issue that endangers all encryption offered by OpenSSL for more than a year (giving attackers access to your private keys via a basic network request) "working in this case"?
The fact that it was found by people outside the project is the system working.
> If you don't think finding Heartbleed after a year in OpenSSL was the process working, how about finding that Shellshock had been hiding in Bash for more than 20 years? Was that still a shallow bug, or is bash a project that just doesn't get enough eyes on it?
Yes, it's a shallow bug. I mean, look at it. And look at who found it.
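For anyone who hasn't seen it, the entire bug is that a pre-patch bash, when importing a function definition from the environment, also executed whatever commands were tacked on after it. A minimal sketch of the classic test (the variable name `x` and the echoed strings are arbitrary; run it against an unpatched bash):

```c
#include <stdlib.h>

int main(void)
{
    /* A value that looks like an exported bash function definition,
     * with an extra command smuggled in after the closing brace. */
    setenv("x", "() { :;}; echo vulnerable", 1);

    /* Any child bash imports the environment. A vulnerable bash
     * executes the trailing `echo vulnerable` during import, before
     * it even runs the command we asked for. */
    return system("bash -c 'echo test'");
}
```

A patched bash prints only `test`; a vulnerable one prints `vulnerable` first. No memory corruption, no timing tricks, just a parser doing exactly what it was written to do, in plain sight.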
> Rather than releasing support for TLS heartbeats that steal your private keys for a whole year, it would have obviously and indisputably been better if the feature had been delayed until a proper security audit had been performed.
How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
> The fact that it was found by people outside the project is the system working.
This happens all the time to Windows, to Intel's hardware architecture, even to remote services where people don't have the binaries at all. There is nothing special about people outside the team finding security bugs in your code. After all, that's also what attackers are.
> Yes, it's a shallow bug. I mean, look at it. And look at who found it.
If a bug that hid from almost every developer on the planet for 20 years (that's how popular bash is) is still shallow, then I have no idea how you define a non-shallow bug.
> How much auditing do you realistically think a project with a grand total of one (1) full-time contributor would've managed?
That's irrelevant to this discussion. Per the essay, even a company as large as Microsoft would be better off releasing anything they do immediately, instead of "wasting time" on in-house security audits.
> If the code hadn't been publicly released we'd still be waiting for the bug to be found today.
I'm not saying they shouldn't have released the code along with the binary, I'm saying they shouldn't have released anything. It would have been better for everyone if OpenSSL had gone without heartbeat support for a few more years than for it to support heartbeats that leak everyone's private keys if you just ask them nicely.
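To make "ask them nicely" concrete, here is a minimal sketch of the bug class, assuming a simplified heartbeat layout of one type byte, a 2-byte payload length, then the payload. It is not OpenSSL's actual code (the function name and offsets are made up for illustration), but the shape is the same: the responder trusts the length field in the request instead of checking it against how many bytes actually arrived.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical heartbeat responder. Layout assumed:
 * req[0] = type, req[1..2] = payload length (big-endian),
 * req[3..] = payload to be echoed back. */
unsigned char *heartbeat_response(const unsigned char *req, size_t req_len)
{
    /* Attacker-controlled length field. */
    uint16_t payload_len = (uint16_t)((req[1] << 8) | req[2]);

    /* The missing bounds check at the heart of Heartbleed:
     * if (req_len < 3 || (size_t)payload_len > req_len - 3)
     *     return NULL;
     */

    unsigned char *resp = malloc(3 + (size_t)payload_len);
    if (resp == NULL)
        return NULL;

    resp[0] = 2; /* response type */
    resp[1] = req[1];
    resp[2] = req[2];

    /* If payload_len is larger than the payload actually sent, this
     * reads past the request buffer and echoes up to 64 KB of adjacent
     * heap (keys, session cookies, whatever happens to be there) back
     * to the peer. */
    memcpy(resp + 3, req + 3, payload_len);
    return resp;
}
```

The real fix amounted to adding essentially that commented-out bounds check, which is what makes the year of exposure so hard to defend.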
This is the point of the Cathedral model: you don't release software at all until you're reasonably sure it's secure. The Bazaar model is that you release software as soon as it even seems to work sometimes, and pass on the responsibility for finding that it doesn't work to "the community". And the essay has the audacity to claim that the second model would actually produce better quality.
> There is nothing special about people outside the team finding security bugs in your code.
That supports the point.
> If a bug that hid from almost every developer on the planet for 20 years (that's how popular bash is) is still shallow, then I have no idea how you define a non-shallow bug.
A bug where you think "yeah, no-one except the x core team could ever have found this". A bug where you can't even understand that it's a bug without being steeped in the project it's from.
> That's irrelevant to this discussion.
Disagree; that the Bazaar can attract more contributions is a big part of the point.
> This is the point of the Cathedral model: you don't release software at all until you're reasonably sure it's secure. The Bazaar model is that you release software as soon as it even seems to work sometimes, and pass on the responsibility for finding that it doesn't work to "the community".
Few people were thinking about security at all in those days, at least the way we think about it now; the essay isn't about security bugs, it's about bugs generally. The claim is that doing development in private and holding off releasing doesn't work, because the core team isn't much better at finding bugs than outsiders are. The extent to which a given project prioritises security versus features is an orthogonal question; there are plenty of Cathedral-style projects that release buggy code, and plenty of Bazaar-style projects that release low-bug code.
It did the literal opposite: the TLS Heartbeat Extension was itself a bazaar (and bizarre) random contribution to the protocol. The bazaar-i-ness of OpenSSL, which has since become way more cathedralized, was what led to Heartbleed, both in admitting the broken code and then in failing to detect it, despite the fact that OpenSSL is one of the most widely used open source projects on the Internet. It comprehensively rebuts Raymond's argument.