Hacker News comments

> From a security perspective there are only two kinds of code bases: open & closed. By deduction one of those will have more eyeballs on the codebase than the other even if "nobody looks".

Is that the classification that matters? I'd think there are only the following two kinds of code bases: those that come with no warranty or guarantee whatsoever, and those attached to a contract (actual or implied) that gives users legal recourse against a specific party in case of damages caused by issues with that code (security or otherwise).

Guess which kind of code, proprietary or FLOSS, tends to come with legal guarantees attached? Hint: it's usually the one you pay for.

I say that because it's how safety and security work everywhere else - they're created and guaranteed through legal liability.




Can you cite an example where a company was sued over bad code? I want to agree with you and agree with your reasoning (which is why I upvoted you as I think it is a good argument) but cannot think of any example where this has been the case. Perhaps in medical/aviation/government niches but not in any niche I've worked in or can find an example of.

The publicly known lawsuits seem to come from data breaches, and the large majority of those breaches are due to non-code lapses in security: leaked credentials, phished employees, social engineering, setting something Public that should be Internal-only, etc.

In fact, many proprietary products rely on FLOSS code, and when that code leads to an exploit, the company owning the product may be sued for the resulting data breaches. But that's an issue with their product contract and their use of FLOSS code without code review. As it turns out, many proprietary products aren't code reviewing the FLOSS projects they rely on either, despite their supposed potential legal liability to do so.

> I say that because it's how safety and security work everywhere else - they're created and guaranteed through legal liability.

I don't think the legal enforcement or guarantees are anywhere near as strong as in other fields, such as, say... actual engineering or the medical field. If a doctor fucks up badly enough, they can no longer practice medicine. If a SWE fucks up badly enough, they might get fired? But they can certainly keep producing new code, and may find a job elsewhere if they are let go. Software isn't a licensed field, and so is missing a lot of the safety and security checks that licensed fields have.

Reheating already-cooked food to sell to the public requires a food handler's card, which is already a higher bar than exists in the world of software development. Cybersecurity isn't taken all that seriously by seemingly anyone. I wouldn't have nearly as many conversations with my coworkers or clients about potential HIPAA violations if it were.


> Can you cite an example where a company was sued over bad code?

Crowdstrike comes to mind? Quick web search tells me there's a bunch of lawsuits in flight, some aimed at Crowdstrike itself, others just between parties caught in the fallout. Hell, Delta Airlines and Crowdstrike are apparently suing each other over the whole mess.

> The publicly known lawsuits seem to come from data breaches and the large majority of those data breaches are due to non-code lapses in security.

Data breaches don't matter IMO; there's rarely, if ever, any obvious, real damage to the victims, so unless the stock price is at risk, or data protection authorities in some EU countries start making noises, nobody cares. But the bit about "non-code lapses", that's an important point.

For many reasons, software really sucks at being a product, so as much as possible, it's seen and trades as a service. "Code lapses" and "non-code lapses" are not the units of interest. The vendor you license some SDK from isn't going to promise you the code is flawless - but they do promise you a certain level of support, responsiveness, or service availability, and are incentivized to fulfill it if they want to keep the money flowing.

When I mentioned lawsuits, that was a bit of a shorthand for an illustration. Of course you don't see that many of them happening - lawsuits in the business world are like military actions in international politics; all cooperation ultimately is backed by threat of force, but if that threat has to actually be made good on, it means everyone in the room screwed up real bad.

99% of the time, things get talked out without much noise. Angry e-mails are exchanged, lawyers get CC'd, people get put on planes and sent to do some emergency fixing, contractual penalties are brought up. Everyone has an incentive to get themselves out of trouble, which may or may not involve fixing things, but at least it involves some predictable outcomes. It's not perfect, but nothing is.

> I don't think the legal enforcement or guarantees are anywhere near as strong as in other fields, such as, say... actual engineering or the medical field. If a doctor fucks up badly enough they can no longer practice medicine. If a SWE fucks up badly enough they might get fired? But they can certainly keep producing new code and may find a job elsewhere if they are let go. Software isn't a licensed field and so is missing a lot of safety and security checks that licensed fields have.

Fair. But then, SWEs aren't usually doing blowtorch surgery on live gas lines. They're usually a part of an organization, which means processes are involved (or the org isn't going to be on the market very long (unless they're a critical defense contractor)).

On the other hand, let's be honest:

> Cybersecurity isn't taken all that seriously by seemingly anyone.

Cybersecurity isn't taken all that seriously by seemingly anyone, because it mostly isn't a big problem. For most companies, the only real threat is a dip in the stock price, and that's only if they're publicly traded. Your random web SaaS isn't really doing anything important, so their cybersecurity lapses don't do any meaningful damage to anyone either. For better or worse, what the system understands is money. Blowing up a gas pipeline, or poisoning some people, or wiping some retirement accounts, translates to a lot of $$$. Having your e-mail account pop up on HIBP translates to approximately $0.

The point I'm trying to make is, in the proprietary world, software is an artifact of a mesh of companies, bound together by contracts. Down the link flows software, up the link flows liability. In between there's a lot of people whose main concern is to keep their jobs. It's not perfect, and the corporate world is really good at shifting liability around, but it does the job.

In this world, FLOSS is a terminating node. FLOSS authors have no actual skin in the game - they're releasing their code for free and disclaiming responsibility. So while "given enough eyeballs, all bugs are shallow", most of those eyes belong to volunteers. FLOSS security relies on the good will and care of individuals. Proprietary security relies on individual self-preservation - but you have to be in a position to threaten the provider to benefit from it.



