Hacker News

Nope, there's no law that says you can't do independent vulnerability research.



Is publicly revealing vulnerabilities/exploits that can damage a competitor considered part of what you're referring to as "research"?


It is considered that, because that is what it is. There is no law dictating how (or why) vulnerabilities are disclosed, and the disclosure of vulnerabilities is a public service.


Has there been any court case where this interpretation has been sufficient defense?


Thousands of vulnerabilities are disclosed every year. Nobody has ever been successfully sued. The burden is on your argument, not mine.


Yeah, I understand that part. But to my understanding this is just due to a lack of clear court rulings on the question (cases often settle or get dropped before reaching court?), not because a court has explicitly interpreted the law in favor of this position. e.g., https://securityboulevard.com/2021/07/what-the-van-buren-cas...

So if a case came up against Google, I imagine they would very much prefer to have this available as a defense, and draw analogies to the real world if necessary (like the home trespassing example in the link above).


You haven't even presented a theory of law that would make this work unlawful. It can't be on me to come up with such a thing just to knock it down.


> You haven't even presented a theory of law that would make this work unlawful.

I did in fact link to 2 entire pages of what might apply, based on my layman understanding:

- https://news.ycombinator.com/item?id=30310902 (which is one potential "theory of law" that might apply)

- https://news.ycombinator.com/item?id=30316448 (a lot of actual cases against actual individuals, each based on different legal theories)

Obviously I don't know if any of these theories would make it unlawful (again, I'm not a judge or a lawyer). I just know security researchers have been sued in the past, and so far it seems to me that those cases have either (a) settled out of court, (b) been dropped, or (c) been scoped too narrowly to set much of a general precedent.

You don't have to feel compelled to knock anything down if you don't know; I don't really expect anyone to know at this point to be honest. (The second website I linked to also mentions this dearth of court rulings.)


Every case in the article you cited involved someone conducting "research" on computers they did not themselves own, as happens when you portscan a remote host, or look for XSS vulnerabilities on someone's SaaS app, or try to pentest the media system on an airliner.

Project Zero doesn't do any of this kind of research.

Nobody is going to be able to sue Project Zero for finding iOS bugs. You have an almost unlimited right to conduct security research on a phone you buy, or a piece of software you install based on a click-through license.

What you need to be very careful about is, again, testing other people's computing devices. There, you have almost no rights at all (save for services that publicly waive their own rights by standing up bounty programs --- and, don't be confused, Project Zero doesn't depend on Apple's bounty programs to conduct iOS research).

These distinctions are super-clear to people who actually work in this field, but evidently not to people outside it, because we end up having the same picky debates every time vulnerability research comes up. I get it, it looks fuzzy from the outside. But it is not fuzzy to practitioners; the rules you have to be aware of to conduct research are actually fairly straightforward. Don't mess with other people's machines.


Thanks for clarifying that.


You've got things reversed here: what would you sue someone for, such that they'd need to mount a defense in the first place?


Well, Sony tried to sue me for disclosing a vulnerability in the PS3. This is what they claimed:

> Violations of the Digital Millennium Copyright Act; violations of the Computer Fraud and Abuse Act; contributory copyright infringement; violations of the California Comprehensive Computer Data Access and Fraud Act; breach of contract; tortious interference with contractual relations; common law misappropriation; and trespass.

Yes. Trespass.

So yes, companies have tried to sue over disclosure of security vulnerabilities in the past. In this one they even ended up settling with one of the other defendants (whom they may have had a bit more of a case against, thanks to the DMCA if nothing else), but I think they realized they had no case against me and most of the others and dropped the lawsuit. They still filed it, though, and I had to get a lawyer, which was not a fun few months.


Journalists have tried to put stories together on vulnerability researchers being threatened (and, hey, I'll raise my hand here: I've been threatened several times). But if you look at the actual instances where things have gotten far enough along to report, the fact patterns break down. It tends to turn out that people are getting threatened for:

* "Researching" serverside apps --- software running on computers the researcher doesn't own --- which is widely understood to fall afoul of CFAA and categorically isn't the kind of work P0 does.

* Breaching contracts, which happens commonly when vuln research firms take on pentest vendor assessment contracts for companies considering purchases, where the pentester access to the target was explicitly arranged under NDA.

* Stuff that isn't vulnerability research under any sane definition, as when people find open S3 buckets, grab all the files off them, and then try to "conduct research" based on the contents of the stolen files.

None of this is at play in the kind of work P0 does, and there are basically no modern stories about straight vulnerability research done under P0 terms where meaningful legal threats have been made. There was a time around the turn of the last century where it was briefly believed that the DMCA might be wielded against vuln researchers, but that didn't pan out.


Except none of the categories you describe apply to what I did, and I got sued. And indeed, the only case they could plausibly have in the scenario I was part of is under the DMCA, because although we did it to run Linux, security vulnerabilities in game consoles can also be used to pirate games.


I don't know, I'm not a lawyer. But I imagine it wouldn't be hard to find some kind of vague basis to sue someone who intentionally causes you harm. Quick Googling suggests business torts are a thing; maybe there's more beyond that: https://www.findlaw.com/smallbusiness/business-laws-and-regu...

And if one has accepted a license agreement to use the product, there's often breach of contract available as a possible basis too.

Are you saying no one has ever been sued over publishing vulnerabilities in competitors' products?



