Is it really a terrible idea to restrict your software from being used to control weaponry? To spread lies about science, like climate change denial or vaccine lies? To host the sharing of child pornography?
The scenarios above amount to handing criminals and bad actors powerful technological tools that allow them to increase the harm done to society. Whether to hand over those tools is a software company's choice; there is no law that says a company must provide its software to anyone who asks, even those with evil intent.
I imagine an autonomous sentry turret (Half-Life, Portal, etc.) comes to mind. Obviously that’s right out.
But what about airsoft or paintball? Those are recreational activities, but in some jurisdictions their respective “guns” can legally be considered weapons because they can still cause serious bodily harm - as well as intimidate.
———
Another example is “Windows for Warships”: this software could be used to build that, no? So, is a warship a weapon? A naval frigate? A merchant marine vessel that carries small arms for self-defense? At what point does arbitrary hardware attached to the computer running this software become a “weapon”?
I’m not a rights absolutist - I just think that when writing a license with restrictions that will be open to interpretation, you should get a gosh-darned lawyer involved and be as specific as possible, and then some. You don’t want a project getting bogged down by internet nerds complaining about stuff like this - least of all an army of trolls who will spam your inbox with hair-splitting questions about the classification of firearms technology to gleefully waste your time. (“it’s not an ‘assault rifle’, it’s a ‘scary-looking semi-automatic AR-15’”, “bump stocks aren’t automatics lololol”, etc.)
If activists can define "infrastructure" as social programs, they can define "weaponry" as end-to-end encryption, cryptocurrency, or just websites that express perspectives that they dislike. We all lose when people play language games like this.
The issue with a "weird" license like this is that anyone other than individuals will be afraid to touch it. That includes corporations, universities, and governments. Existing licenses like MIT or BSD are well understood and don't need to go through legal. Strange clauses turn into lots of hours and lead time while legal makes sure they won't be an issue.
>Is it really a terrible idea to restrict your software from being used to control weaponry?
Seems reasonable until you do something unrelated that offends the licensor and they decide to leverage a very loose definition of weaponry/munitions against you. For example, have a look at what is considered a munition for the purposes of export controls in the USA, and consider what additional items have been regulated as munitions in the past - strong encryption, for instance, was long treated as a munition under US export rules.
> Is it really a terrible idea to restrict your software from being used to control weaponry? To spread lies about science, like climate change denial or vaccine lies?
No, but there's a terrible slippery slope in letting the creator make value judgments about uses of the software after it has been adopted and exert power over the users that way.
> To host the sharing of child pornography?
I think it's a bit of hyperbole once we're extending this value judgment to conduct that is already otherwise illegal. Is someone breaking the law to host child pornography and facing criminal charges going to be deterred by the possibility of civil damages from license noncompliance?
> Is it really a terrible idea to restrict your software from being used [to do all sorts of things]
Possibly. If the list of restrictions is ill-defined and ever-changing, it opens up a can of worms, which in practice means no one will use the software.
> To spread lies about science, like climate change denial or vaccine lies?
Of course it’s only a lie when your in-group defines it as such.
I recall that at the beginning of the pandemic the CDC and the Surgeon General said “masks weren’t necessary” and that “the lab leak theory is a conspiracy”. Of course both proved to be Noble Lies put out by those institutions.
I know it’s comforting to think of the world in terms of good/evil and truth/lies, but unfortunately it’s a messy complicated world and things are not as clear cut as we’d like.