Hacker News new | past | comments | ask | show | jobs | submit login

Responsible disclosure is one of the worst things to happen to the information security industry, and the sale of exploits is similarly terrible. Modern-day white hats have effectively turned into 'green hats' whose goals are no longer compatible with the previously-held ethical hacking belief system.

Personally, I feel that the sale of exploits should be outlawed, or at least the sale to customers who have no intention of fixing the exploit (buying a hacker's silence), or to governments intent on conducting clandestine/offensive operations.

I also agree with the EFF's stance on exploit sales: they should be front and center in any national cyber-security debate:

https://www.eff.org/deeplinks/2012/03/zero-day-exploit-sales...

Also on Schneier.com: http://www.schneier.com/blog/archives/2012/06/the_vulnerabil...

No, I'm not arguing for zero disclosure, or that security researchers shouldn't be compensated for their hard work. I just think that the people and organizations that sell these exploits are effectively complicit in the destruction that results from them.

The world of information security is slowly but surely moving towards a Cold War scenario in which nation-states foster sustained and escalating tensions, especially as they maintain stockpiles of exploits to be deployed in wartime.




I don't think exploit sales could ever be made illegal in the US. Courts have already ruled that code is speech [1], and the Constitution puts restrictions on how the government can limit speech. First 0-day exploits are illegal, then The New York Times.

Exploit sales are basically a byproduct of living in a free society. If you want them to go away, find the exploits yourself and post them to full-disclosure. Or pay someone to.

[1] http://en.wikipedia.org/wiki/Bernstein_v._United_States


This is the best argument against making them illegal, in my opinion. If you believe that source code, code implementing cryptography, and privacy software like Tor and OTR are speech, then you can't set a double standard just because you don't like what sold exploits may or may not be used to do.


Publishing exploits should certainly be protected as free speech, but the sale of exploits to a party with ill intent (governments included, even ours) is moving into the realm of arms dealing because that exploit is going to be weaponized. Intent matters.

If there is a transaction with weaponization as the intent => arms dealing

If it is published to edify => free speech


By that logic, having a Perl RSA implementation in your signature is arms dealing, because the intent is for that speech to spread so that others can weaponize it into encryption software. Reality already won that battle; let's not fight it again.


Can you legally sell exploits of physical systems? Say, a book containing instructions for breaking into any military installation?

Maybe it would be enough if an international treaty required all nations, and their intelligence and law enforcement arms, to abstain from using software exploits and to disclose any exploit information they acquire to system vendors for prompt correction.


Probably.

A more recent example: can you sell the NSA's confidential slides? The Washington Post apparently can.


We already make certain practical exceptions to free speech that take intent into account: slander, blackmail, death threats, shouting fire in a theater, etc.

Though I'm dubious about creating (more) wedges that could erode constitutional rights, outlawing only the selling of exploits seems reasonable, whereas if you tell the world about a zero-day exploit for free, it falls back under free speech.


Using the exploits is already illegal.

The examples you list are really not about the sharing of ideas. The criminality of those acts centers on the harm caused, not the act of speech itself. The problem with shouting fire in a crowded theater is not that you vocalized the word "fire"; it's that you caused unnecessary panic. If you had pulled the fire alarm, the same panic would have arisen, but without any speech involved. So even though you used your body's built-in fire alarm (speech) to cause the false panic, and the Constitution protects you from government intervention in most uses of that built-in fire alarm, you're not exempt from the consequences of inciting panic.

Blackmail and death threats are much the same as shouting fire in a crowded theater. Society has an obligation to protect its members from harm, and when you give an "early warning" that you intend to harm someone, it makes sense for society to use that information to intervene before the harm actually occurs.

Slander and libel are very tricky, mostly involving civil penalties rather than criminal penalties. If you slander someone and it causes them no damages, the government is not going to throw you in prison. I've never liked the slander/libel exceptions and did a bit of reading; Wikipedia's article on the subject says: "In a 2012 ruling on a complaint filed by a broadcaster who had been imprisoned for violating Philippine libel law, the United Nations Commission on Human Rights held that the criminalization of libel violates freedom of expression and is inconsistent with Article 19 of the International Covenant on Civil and Political Rights."

I think this is basically the right idea. Look how much time legitimate authors spend defending themselves from libel claims in the UK. Spreading this madness to the US for the perceived benefit of not being able to tell someone how to get a computer program to write to unallocated memory seems pretty stupid to me.

I wonder what happens today when you sell an exploit to someone that then uses it to cause significant monetary damage. If we're being consistent in the application of our laws, it will be the same thing that happens to the company that manufactures the weapons used in school shootings.


You could criminalize parts of the transaction without criminalizing speech. For example, there's no Constitutional right to buy someone's silence, so you could make it illegal to give someone money on the condition that they not disclose security exploits to other parties. That would preserve the freedom to disclose exploits, the freedom to disclose exploits for pay, and the freedom to be silent, taking away only the freedom to buy someone else's silence about software defects (and implicitly the right to take money in exchange for silence about software defects.)


Pay in installments; the installments end when the bug is found, whether revealed by the creator or by someone else.


Let's play devil's advocate for a moment.

Why should selling exploits be illegal?

If I spend many weeks looking for an exploit and I find one, what do I get by disclosing it, to say, Google? The Chrome project does pay a little, but I could probably get more in a bidding war. If Google wants the exploit, it should be able to outbid the other people who are willing to buy it.

Time is money after all, and by disclosing an exploit I worked so hard to find I'm basically giving it all away to Google... and for what? For ethical reasons? What ethical reasons? Google is not a person, it's a corporation.


People use Chrome; that is the ethical issue. As long as we're not pretending that we're ethical hackers, then that's fine with me. I find it extremely hypocritical to claim to be an ethical hacker and sell exploits at the same time.


Well, people look for exploits in popular software, so pretty much anything that is being exploited is being used by people. It's Google's job to secure Chrome, not the people out there that are looking for exploits.

Why would you work for free (or almost for free)? Do you not need to pay for food and shelter? Frankly, I don't see why finding an exploit and then selling it to someone is unethical. (By the way, you can replace Google with Microsoft or your other favorite corporation.)


I swear I'm not trying to be rude or snarky here, but do you also believe that manufacturing and selling weapons (guns, ammo, aircraft carriers, whatnot) should be illegal?


If you take that analogy seriously, a lot of current exploit sales would already be illegal, even without restricting weapons sales further (which I personally think should be done, but that's another discussion).

For example, you cannot generally export weapons to another country without a license. In the U.S. you cannot even sell weapons across state lines without a license. Many countries also restrict the sale of even low-level devices likely to be used for sabotage, e.g. molotov cocktails, and that may extend to marginal sabotage devices like spray paint as well. Burglary tools are another example of a restricted device that has some analogies.

Lots of what the "security industry" does would land them in jail in any other field, even if it were selling low-level burglary devices to car thieves. Whether it's a good thing it doesn't, I'm not sure. The people taking money for exploits in a "don't ask what they'll do with it" kind of arrangement have a pretty dark shade of gray on their hats.


Right, so why is nobody instead saying that exploit sales should be regulated? Why are most people drawing such stark distinctions between exploit weapons and meatspace weapons?


Meatspace weapons are easy to detect, since most if not all modes of transportation between nations are monitored and recorded, while exploit weapons are quite impossible to manage in the same fashion.

Decentralized and obfuscated means of communication make it easy to engage in this behavior at scale, for example Tor and Bitcoin.

Meatspace weapons of significant destructive capability are also large and bulky.


That doesn't explain why they need to be completely outlawed rather than more strictly controlled, or handled in some other way.

Also, if exploits are "quite impossible" to manage like regular weapons (and I agree that is probably the case; see how crypto export controls worked out), it goes both ways: any laws outlawing trafficking in these weapons will be equally impossible to enforce.


I agree that they would be impossible to enforce, but there are quite a few nations across the world that have achieved minimal gun ownership and relatively high levels of safety.

They have cultures that emphasize open and diplomatic dialog with the community, and they support trust and reciprocity between law enforcement and ordinary people. Meanwhile, the most heavily armed citizenries see their arms as an existential necessity, which breeds distrust and fear.

Consider for a moment the decision-making process between two rational individuals who pose an equal threat to each other but are deeply suspicious: if one advocates disarmament, the other will view it with suspicion. In actuality the danger is non-existent, but (for example) an ignorant third party under the protection of one would screech at the very idea of de-escalation.

My point is that weapons control is as much about perceived safety as it is about defensive and offensive capabilities.


No, absolutely not. There are many instances in history where fear and uncertainty were used as political tools to disarm and otherwise control 'untrustworthy' or 'unreliable' people. I'm sure you can imagine what usually follows that line of thought.

I have a hard time reconciling that with my feelings about information security and I can see how this would contradict my statement above, but I have different feelings about information security and physical security (ie: in information security there is a very fine line between offensive and defensive capabilities). You and anybody else are welcome to convince me otherwise.


Exploits are not weapons, and weapons are very, very strictly regulated.

Do you think you should be able to disclaim any responsibility after you sold information about faulty material used in bridges all over the world to third-parties with unknown intentions?


"...Personally I feel that the sale of exploits should be outlawed...to governments that are intent on conducting clandestine/offensive operations..."

You expect governments...

to outlaw the sale of exploits....

to themselves?


Yes, absolutely. If we're going to shout to the world that we have a moral high-ground then we should put our money where our mouth is, otherwise we're just two-faced crooks and liars.



