Tech companies and governments sign up to Christchurch Call agreement (rnz.co.nz)
122 points by _mgr on May 15, 2019 | 321 comments



This just sounds like a giant can of worms that is going to blow up in people's faces. Set aside for a moment any potential future government realignments, just so we aren't talking about hypothetical fictitious governments. Let's just examine the governments currently signed to it.

Indonesia has harsh religious laws, cracks down on reporting it deems illegal, and literally raids LGBTQ gatherings. The Senegalese government arbitrarily arrests dissidents, the LGBTQ community has to hide because it's illegal, and protests are outlawed. India already overuses counterterrorism laws to charge dissidents and activists, and religious minorities there suffer heavily from discrimination.

This type of call to action will only further entrench government strangleholds on freedom of speech. Now people have given them the moral authority to curb an already very broad and ambiguous category of terrorists, and now extremists too. Sure, I get it: there is a bunch of vile content on the internet and the world would be a better place without it. But my "better place without it" is different from somebody else's, and so is the "it". This won't end up like what is in your head.

The hate isn't "spreading through social media"; the hate and the fear were already there. These people grew up with it. Social, cultural, religious, sexual, moral borders, you name it: every border we have is being rewritten, and when you rewrite those borders, especially this quickly, people are gonna get scared, they're gonna lash out, and because of it more people are getting scared and want to control one of the most powerful tools of freedom.


>Let's just examine the governments currently signed to it.

The interesting thing is that this really doesn't matter that much in the grand scheme of things.

Consider weapons treaties, like the UN ones banning the use of land mines and cluster munitions. The only countries that have signed them either don't have any reason to use them, or are allied with a nation that hasn't signed that treaty.

This is much the same story. The US hasn't signed this, and never will (because it explicitly contravenes a cornerstone of its supreme law), and at that point what the other countries do is pointless unless they outright block US services from their networks, in which case there will be riots in the streets. Governments don't survive for long when they alienate the vast majority of their population, and the majority of the population uses US services.

Combine that with the simple fact that 100% effective moderation of an online service is unscalable to the point of being impossible without prohibiting any meaningful content or conversation, and countries that do sign this and implement it in their law will never be able to develop a competitive Facebook alternative. All you've accomplished (as a signatory nation) is political posturing and shooting yourself in the foot.

You can't outcompete a free nation. That's kind of its main advantage.


> Combine that with the simple fact that 100% effective moderation of an online service is unscalable to the point of being impossible without prohibiting any meaningful content or conversation, and countries that do sign this and implement it in their law will never be able to develop a competitive Facebook alternative. All you've accomplished (as a signatory nation) is political posturing and shooting yourself in the foot.

Hell, 4chan trolls moderation with really bad content for sport from time to time. I hate to go there, but I think it's sometimes necessary to see it happen to prove that laws don't stop people already violating laws from, gasp, violating laws. You can't moderate them IRL, so what makes you think you can stop them online?


That's true. I never thought of it that way. I mean, I hope the effort would evolve into something more sensible, but you never know, given the willingness of brilliant people to sign on to bad ideas.


> unless they outright block US services from their networks, in which case there will be riots in the streets.

People aren't going to be rioting in the streets because they lose access to Google and Facebook, come on.

People in western democracies put up with a hell of a lot worse than that from their governments without rioting.

> You can't outcompete a free nation. That's kind of its main advantage.

I'd like to believe that was true, but I think China will prove us wrong in the coming decades.


"You can't outcompete a free nation."

This may or may not be true, but a free nation can collapse on itself. The Weimar Republic and modern-day Hungary and Turkey are just a few examples.


> The hate isn't "spreading through social media"; the hate and the fear were already there.

This is an idea worth examining more closely. In one view, social media acts simply as a passive reflection of an existing culture; in another, social media reproduces that culture, which later becomes (in part) a reflection of social media, and so on. The causality goes both ways.

Organised white supremacists have decided that social media can also be used to help steer society, and it can be, and is, used as a recruitment vector, two things that wouldn't be possible if the 'passive reflection' model were accurate.

For what it's worth, I'm undecided as to whether this particular proposed response is workable or harmful or effective or whether it risks liberties collaterally. But I view that as a separate question from whether social media plays a significant role in white nationalist recruitment.


I definitely agree. I totally understand that they use it. And hell, with the amount of demographic information available to advertisers, I'd wager you could target your message down to the individual. I just don't think censorship is the way. If anything, you just push them into more private circles where potential recruits won't readily see the other side of the argument.


Seems like a false dichotomy: you can both push white supremacists into private circles and provide a counter-narrative that's readily available. See Islamist radicalisation, for example.

The other issue with the concept of winning a war of ideas is that the other side isn't usually available in the venues espousing white supremacy. They are already private circles in that sense, with moderation. So we're in the situation of hoping the people being actively radicalised by them happen upon the other side of the argument and haven't already been poisoned against it. Which seems like wishful thinking.


I believe it's something in between both views: society is just shifting its operations to social media. This does bring a few changes, but ultimately nothing new is happening; it's just happening somewhere else and becoming easier to identify.


I think the growth of the flat-earth and anti-vax movements provides two examples that disprove your statement. Before the Internet, the average person simply would not have come into contact with material promoting these ideas.


> I think the growth of the flat-earth and anti-vax movements provides two examples that disprove your statement

What growth are you talking about? Is there actual research indicating that the number of flat-earthers has grown compared to previous decades?

And why do you think there is a relationship between flat-earthers and social media, as opposed to, say, between HIV, cancer, and other diseases and social media? Social media made you more aware of flat-earthers than before, but it didn't turn you into one.


Good point.

Then again, in the same way, people can become aware of the problems with their political system through the internet. I understand this has happened in Syria within the last generation or so.


This is definitely incorrect:

> The hate isn't "spreading through social media"; the hate and the fear were already there.

If extremism couldn't be effectively spread, extremists wouldn't bother trying. But they put quite a lot of effort into recruiting and propaganda.

For those interested, I strongly recommend Neiwart's "Alt-America": https://www.amazon.com/dp/B01MT2KCB2/

He's a journalist who covered extremists (e.g. violent white supremacists, "patriot" militias) in the 80s and 90s. He makes a very good case that the Internet has been an enormous boon to them, enabling significant growth and normalization.


Extremism was spreading long before the internet. We literally had a Cold War to try to stop the spread of communism, not to mention the countless examples in history where the populace was swayed by extremism. The Nazis didn't have the internet, but they were perfectly capable of getting people lined up for war. It's definitely not the technology; they will find any method they need to spread their message.


> Extremism was spreading long before the internet.

Yes but not as quickly, cheaply, or effectively. You might as well say "planes do nothing because people were migrating between continents before they existed".


Again, read the book. The Internet made all sorts of communication and organizing easier. Most of that is good. Some of it is terrible.


Don't you think the medium used to communicate ideas determines how fast and how far they spread?

Sure, I can go down into the streets, shout my hate speech at the top of my lungs, put propaganda flyers in random mailboxes, &c ... Or I can upload a video to every video hosting website and reach more people in a single day than I could in a lifetime of organic propaganda.

It's like saying the internet didn't reshape the world because we could already send data over long distances before. Yeah, sure, if you consider a boat carrying handwritten letters from the EU to the US just as good as our current 60 Tbps transoceanic internet cables.

Also, don't forget that the Nazis were elected to power and were the official government; it's much easier to use propaganda when you're in that kind of position.


> Sure, I can go down into the streets, shout my hate speech at the top of my lungs, put propaganda flyers in random mailboxes, &c ... Or I can upload a video to every video hosting website and reach more people in a single day than I could in a lifetime of organic propaganda.

Because the printing press, newspapers, filmed movies, radio and television never existed? /s

Mass media has been around for a long time.


First off, those media are mostly controlled by state-regulated entities, meaning they were mostly used for state-sponsored propaganda. It's not like I can press 696 on my TV remote and tune in to ISIS-TV with live executions.

Second, yes, they did exist, and they also did increase the reach of ideas over what was available before them, so what's your point? You're basically extending my argument.

Progress in communication technology means increased reach for propaganda / hate speech / whatever. Now you have two choices: do nothing, or try to curb their reach. Should we be OK with live executions on twitch/yt/fb? You're free to think that we should, but I'd argue most people are against it.


"First off these mediums are mostly controlled by state regulated entities."

Pirate radio, printers, peer to peer sharing, digital formats in general, not controlled by the state and you can still use or access much of that equipment or content. It's not convenient or easy for people, but for the most part it is doable.

"Progress in communication technology means increased propaganda / hate speech / whatever reach. Now you have two choices, do nothing, or try to curb down their reach. Should we be ok with live executions on twitch/yt/fb ? You're free to think that we should, but I'd argue most people are against it."

Those are private entities and they can choose what they want to do on their site. Also, you have distorted the argument as against any type of action against all forms of speech. No one here is saying we should allow streaming of rape, child pornography, murder, etc. Those are already illegal and that problem is already dealt with. The issue we are talking about is allowing people's ideas to be spread.

Already in this thread within a day you see people talking about anti-vaxers and flat-earthers in a debate that is actually supposed to be about extremists and terrorists. But, as these things do, you start pondering the scope and what other bad ideas we shouldn't have let spread or could have been stopped if only we capped our freedoms.

I don't think the state should be in charge of deciding what is and isn't protected speech in this case. Once it is done, when its in the hands of the state, good luck trying to get it undone. It's the edge our understanding. We are more connected than we have ever been before and the complexity of the issue is extremely high. Rushing into the situation and putting a blanket ban on types of speech is a bad idea.


> The hate isn't "spreading through social media"; the hate and the fear were already there.

Well, hate and fear we may all share. But the idea of expressing them by killing innocent people is factually "spreading through social media".


That being said, India blocked reddit a few days after the Pakistan attack earlier this year.


> This type of call to action will only further entrench government strangleholds on freedom of speech.

There is a certain fringe of society which believes that the real reason for the censorship of the Christchurch attack is not because it motivates and inspires further killing, but that it shows he was not a lone wolf and had handlers helping him step through the scene - and the video clearly shows evidence of this.

IF this were true - and I'm not saying it is - then yes, we have reason to be concerned about being lied to in this fashion. If the general public is never able to tell the truth about these kinds of incidents because they are wrapped up in even more dire secrecy, then the incidents themselves become even more dangerous to society as a whole.

Whether we like it or not, there are very powerful actors and groups out there who would seek to profit at every turn from terrorising the general public. They're not all jihadis or right-wing nutcases, but they certainly know how to present themselves, and/or their Manchurian-candidate co-criminals, as such.

This sort of duplicity is only going to get worse in a culture of utter secrecy - the only thing that can save the general public from the nefarious deeds of those who would use terror to control our minds, is the light of truth.

This means having the courage to let videos of these events be accessible to the general public. This means not living in a protective bubble provided to us by higher powers, but rather living the kind of life that will be unhindered by such evidence when it is presented.

This is a very difficult subject, precisely because people are overly sensitive about the kinds of things they wish to be exposed to. In the same sense that the American public is virulently anti-war when presented with real evidence of the effects of war (e.g. Vietnam), it is violently pro-war when it only sees one side of the mighty military-industrial-pharmaceutical complex's pitch deck.

If we want to be ruled by terror, we must merely allow the locks of secrecy to be closed around our eyes. If we want to live in a world where different cultures and different religions, and people with vastly different points of view to our own nevertheless get along, we must never hide from the truth, no matter how traumatising it can be.

This move by the coalition governments is a cynical, duplicitous attempt to make it harder for whistle-blowers and evidence seekers to know the truth about these kinds of attacks, and puts us all in the position of being liable for the lies told to us by those who have the means to pull off such operations - i.e. the military-industrial-pharmaceutical complex.


"Let's just examine the governments currently signed to it."

You picked three. How about Australia, Canada, the European Commission, France, Germany, Ireland, Italy, Japan, Jordan, the Netherlands, New Zealand, Norway, Spain, Sweden, and the United Kingdom? Are you going to examine those? Most of those are the most free, most protective countries.

"The hate isn't "spreading through social media" the hate and the fear were already there."

There has always been hate. There have always been incredibly ignorant beliefs like flat Earth, anti-vaccine, etc. Usually it is isolated and the community -- the village -- keeps it in check and provides a pressure valve. Social media, however, allows us to surround ourselves with a bubble that makes us think that it's normal if not somehow heroic, our graph of similarly extreme believing people cementing our resolve. That guy who became curious and got some bad information isn't being reasoned with by rational people, he's being indoctrinated by a distributed group that only reinforced their own beliefs.

It absolutely has made the problem worse in otherwise reasonable societies.


> It absolutely has made the problem worse in otherwise reasonable societies.

I'm genuinely curious, has it really? I wonder if it's just more widely reported?

I tried looking up terrorism statistics and found this: https://ourworldindata.org/terrorism

For most countries, it doesn't look like much has changed from 1970-2017.

I glanced at hate crime stats on the FBI website (just looking at 2008 and 2017, with 7,783 and 7,175 incidents reported, respectively). So not much change there either.

Obviously this is a half-assed attempt, but I just wanted to counter the idea that the world is falling apart. Hopefully someone who has more interest in these things can provide some studies with more comprehensive statistics.


Many people seem unaware of how bad the 70s were, and that definitely predates Facebook.

https://www.npr.org/2015/04/05/396359930/explosive-protests-...

I suspect people just like to have a boogeyman to blame so that they feel they have more understanding or control of the situation than they actually do. "It's a bad situation and we don't know what the cause is or how to fix things" isn't something you'll hear very often, particularly from politicians.


And in the 80s too. Anarchists, Nazis, and friends in Italy were engaged in an actual bombing marathon with huge loss of innocent life, and I'd argue that if being mean on the internet is action enough today for these groups that they disregard their traditional methods, like literally bombing places, then we are all better off for it.


I can never wrap my head around the thought process. Kill innocents and somebody else will be impressed with Anarchism somehow?


Think of it as hostage-taking at mass scale, intending for fear of continued bombings to lead to capitulation.


"Are you going to examine those? Most of those are the most free, most protective countries."

I could, I suppose, go over more of them. The point, though, that I was making when I exempted fictitious future governments is that it can be used NOW by real governments. Sure, there will be nations that don't abuse it indefinitely, but you can't ensure a government's intent 5 years from now, let alone 50. There are plenty of cases I could see in some of those nations where it could be used against the people in ways that may be less black and white, like organization efforts for the yellow vest protests being censored and removed, or any peaceful protest that could get out of hand for that matter. Or maybe even Google and Facebook making anti-tech protests less visible because black bloc may show up.

"There have always been incredibly ignorant beliefs like flat Earth, anti-vaccine, etc. Usually it is isolated and the community -- the village -- keeps it in check and provides a pressure valve."

Well, for one, the idea of a non-geocentric universe literally got a man burned at the stake for blasphemy, people were burned as witches, and religious wars raged all over the planet, so the problem isn't technology, it's humans. And I'd argue the only thing that has made us more tolerant is the freedom of information, the freedom of speech, and the exposure to ideas. Debate changes minds; telling people to stay hermits doesn't. And not only does it not change minds, as others have pointed out: in this age, the intolerant will spread their message somehow, some way.

I would also argue that the bubble you are talking about doesn't have much to do with the ability to connect and instead has more to do with the way social media platforms are designed. They are just like casinos. Just look at Twitter. There is no thumbs-down, only likes and views. So you don't get the "oh you lose" social cue; you only get the flashy bling of increasing likes and views. In normal society you get a disgusted look or ridiculing laughs to tell you if you are saying something unacceptable in public. But that's not how "social" media works. And I think that specific point is what needs to be worked on. So I think we might have some common ground there at least. But I still think censoring is the wrong way to go. Just let the world see the true social acceptance score and I think you will see a lot of that isolation of bad ideas.


"...is that it can be used NOW by real governments." "They are just like casinos. Just look at twitter."

Two good, but arguably separate points. I'd say that the internet generally (besides the .0001% of it that is of bonafide intellectual content) has gone down the same road as slot machine design and they're getting better at it.

The main point (and threat) of agreements like this is that they set up a legal framework and the physical ability to control what people see from a central switch. Combine that with the ever-improving ability to both addict people to pictures on screens and to nudge them in a desired direction, it's really all about power.

I suppose in the final analysis it's all just takes the place of state sponsored religions.


But for your example it isn't needed whatsoever. They do what they want without it. Denigrating a good initiative by association with a small number of suspect participants is not convincing.

"Or maybe even Google and Facebook making anti-tech protests less visible because black bloc may show up."

Google and Facebook can do anything and everything they want right now. And we know that they, among others, bring forward the most contentious and most divisive content because it draws engagement. We know that they immerse people in their own filter bubble where suddenly they live in a world where seemingly everyone is a flat-earther, or believes in Pizzagate, or whatever.

"so the problem isn't technology, it's humans"

This is specious. In a period of ignorance, ignorance reigned supreme (not to mention a theocracy, which is dangerously close to re-emerging in the US). We now, at least in some realms, have an ability to reason and to cite and use fact.


"They do what they want without it."

That argument goes both ways: if they can do what they want now, then there is no need to sign it and it's just a fluffy, feel-good piece of paper. But the fact is, there is power behind people, governments, and organizations throwing in public support.

"This is specious. In a period of ignorance, ignorance reigned supreme (not to mention a theocracy, which is dangerously close to re-emerging in the US). We now, at least in some realms, have an ability to reason and to cite and use fact."

This is literally the point I'm trying to make. The hate is ignorance, and the only way to stop that ignorance is to allow the flow and freedom of information and the exchange of ideas. That's the only way we are going to be able to change any minds.


>So you don't get the "oh you lose" social cue [on twitter]

You don't get the silent downvoting that occurs on sites like Reddit, but you do get overt disagreement, which can go viral and result in consequences in real life such as being fired and harassed. Added to that, Twitter is very proactive in banning people. My point is that the absence of one form of feedback does not preclude the use of other kinds.


That's a straw man; plus, Nazis/white nationalists are groups that actually spread anti-LGBTQ propaganda.


Sorry, that's not what a straw man fallacy is. You may not agree that the danger of censorship misuse is high, but it's not fallacious.


Really? A lot of the alt-right seems pro-LGBTQ to me.


Are you suggesting that certain countries would describe LGBTQ activities as "terrorist and violent extremist content"? I don't think they would need to, they have special anti-LGBTQ laws already.


Given that Saudi Arabia considers atheism to be a form of terrorism already, I don't see it as particularly far fetched. The reason to do so is because it allows making misleading claims for propaganda purposes.


I suppose it's a familiar strategy: pass draconian laws against terrorism, then expand the definition of "terrorism" to anything you don't like.

https://www.nytimes.com/2019/05/13/us/politics/georgia-offic...


Correct, and now they have the moral authority and an excuse to continue doing it. That sort of support, especially when you can say from a podium "the world is behind us in this effort", is a powerful tool for governments to use. Imagine being gay or lesbian and hearing you might as well give up because the whole world is against you.


Well, at least Russia routinely bans LGBTQ-related websites for extremist content.


This is a very hard problem that YouTube and Facebook made for themselves by becoming the world’s largest advertising platforms. They depend on engagement for ad revenue, they designed world-class algorithms to promote this engagement, and it turned out that extremist content happens to be very engaging.

And so the problem is that building an algorithm that blindly promotes whatever keeps users on the site, for all its complexity, is a far more tractable problem than building a system that can avoid promoting content that promotes violence. In the meantime, they throw armies of people at the problem, to moderate content and respond to user reports, but it's a losing battle.

They had the technology to create a monster, but don’t have the technology to stop it.
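To make that asymmetry concrete, here is a minimal sketch (the scores and the flag_score classifier output are hypothetical stand-ins, not any real platform's system): ranking by predicted engagement is a one-liner, while everything hard is hidden inside the classifier that would have to produce flag_score reliably at scale.

    # Toy illustration only; the numbers and the "flag_score" classifier
    # are hypothetical, not any real platform's model.
    videos = [
        {"id": "a", "predicted_watch_time": 9.2, "flag_score": 0.05},
        {"id": "b", "predicted_watch_time": 7.4, "flag_score": 0.91},  # engaging but flagged
        {"id": "c", "predicted_watch_time": 3.1, "flag_score": 0.02},
    ]

    def rank_by_engagement(items):
        # The tractable problem: promote whatever keeps people watching.
        return sorted(items, key=lambda v: v["predicted_watch_time"], reverse=True)

    def rank_with_safety_filter(items, threshold=0.5):
        # The hard problem hides in flag_score: it must come from a classifier
        # that reliably recognizes violent/extremist content at scale.
        return rank_by_engagement([v for v in items if v["flag_score"] < threshold])

    print([v["id"] for v in rank_by_engagement(videos)])       # ['a', 'b', 'c']
    print([v["id"] for v in rank_with_safety_filter(videos)])  # ['a', 'c']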


Except that advertisers are not interested in sponsoring extremist content, so you could argue the incentives go in the other direction.


YouTube permits a lot of demonetized videos it doesn't dare sell advertising on, or (previously) that YouTubers wouldn't give permission to sell advertising on, because the alternative of erasing the videos would send the YouTubers and their viewers to DailyMotion or Vimeo. As long as the viewers of demonetized videos occasionally watch a non-demonetized video, or the sometimes-demonetized YouTubers occasionally make non-demonetized videos, YouTube makes more money that way.
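A back-of-the-envelope version of that trade-off, with entirely invented numbers, just to show the shape of the incentive:

    # Hypothetical figures, only to illustrate the retention argument above.
    monetized_views_per_month = 80       # views where ads are actually sold
    demonetized_views_per_month = 20     # views no ads are sold on
    revenue_per_monetized_view = 0.002   # dollars, invented figure

    # Keep the demonetized videos: the viewer stays and their monetized
    # views keep earning.
    revenue_if_kept = monetized_views_per_month * revenue_per_monetized_view

    # Erase them: assume the viewer (and their monetized views) leaves
    # for another site.
    revenue_if_erased = 0.0

    print(revenue_if_kept, revenue_if_erased)  # 0.16 vs 0.0 per viewer per month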


That's a weaker incentive, though. And it comes with a risk of occasional scandals that scare away advertisers, at least temporarily.

I don't think incentives are enough to predict what YouTube will do. The future is uncertain, there are a lot of policies that could plausibly make money, and the decision-makers are people who read the news and have opinions about what's right, not paperclip maximizers. Politics matters as well as economics.


True!


> They had the technology to create a monster, but don’t have the technology to stop it.

They don't have the incentive to stop it.

All you have to do is make posting have an actual price, no matter how nominal, and most of this goes away.

Unfortunately for their profits, it will wipe out 90+% of their user base. And it will kill virality cold (you won't forward something to your whole address book if it costs you a dollar).


The other incentive option is to charge the platform for the violation. Aka, remove Section 230. If the platform was to be charged for posting dangerous content, the platform would magically be able to largely prevent it overnight.


>Aka, remove Section 230. If the platform was to be charged for posting dangerous content, the platform would magically be able to largely prevent it overnight.

That's not actually what Section 230 does. It prevents the platform from being held liable if it makes a mistake while moderating.

Removing Section 230 would do more harm than good if you consider censorship good, because then the only option to escape legal liability would be to allow everything. Remember that Section 230 was part of the Communications Decency Act and was designed to ensure that websites could remove pornographic and objectionable content without being held responsible (of course, it's also the only part of that law that was found constitutional, and for good reason).

By taking away the only legal immunity for half-baked censorship (i.e. a private company gets to moderate what's posted but won't suffer consequences for getting it wrong), you've closed the door even harder on it; US law won't actually let you censor consequence-free otherwise, which is the whole reason S.230 was added in the first place.


Charge them with what? In the US there is nothing illegal about being a Nazi. There is nothing illegal in selling Mein Kampf etc.


Please don't compare selling a book with being a Nazi.


Nor claim that reading a particular book makes one a Nazi.


Hum... that's exactly the point of not comparing the two things.

Reading a book, as bad as it may be, starts a well-informed debate that might improve the world. You can't critique or remedy something you don't know exists.

Hating a particular group of people (because you read the book, or didn't pay attention in life, or for any other reason), on the other hand, does not improve anything.


Suppose you have two video-sharing platforms which use basically the same recommendation algorithms (etc.) to keep people engaged and clicking on videos, and they're equally effective — except that one of them censors recommendations based on some political criteria, for example, being "extremist", such as whoever the modern equivalent of Martin Luther King is. Let's call the censored platform "JedgarTube" and the other one "MLKTube", for lack of better terms. There are a couple of possibilities:

1. Those recommendations were actually less effective at keeping people engaged than whatever recommendations replace them on JedgarTube. In that case, MLKTube needs to copy the decision in order not to lose users gradually to JedgarTube. In fact, whichever platform blocks those recommendations first will experience improved user growth and engagement. Essentially the recommendations were just a bug of a primitive recommendation algorithm.

2. Those recommendations were actually more effective at keeping people engaged than whatever recommendations replace them on JedgarTube. In that case, JedgarTube will gradually lose users to MLKTube, again, assuming the platforms are otherwise equal.

Of course, the truth is that any particular censorship decision could fall into either #1 or #2. The #1 censorship decisions will be copied by MLKTube, if they aren't too incompetent or principled, while the #2 decisions will gradually accumulate into a competitive disadvantage for eyeballs at JedgarTube.

That's why the media platforms and government censors are trying to set up a global censorship system — whichever platform steps up first to be JedgarTube will lose viewers to whoever's censorship implementation is a step or two behind.

Note that none of this logic depends on normative judgments such as "extremist content is bad", "extremist content is good", "people should have freedom of speech", "people shouldn't have freedom of speech", "platforms shouldn't manipulate people with algorithmic recommendations", "algorithmic recommendations are good for people", or anything like that. It's purely reasoning about objective causes and effects about people's behavior, although I've chosen my terms to weaken the evident bias against "extremism" in this discussion so readers can reason about these causes and effects instead of being thrown around by their emotional biases.
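As a toy illustration of those two cases (entirely made-up engagement numbers and switching rate, not a model of any real platform), you can watch the audience drift in whichever direction the engagement gap points:

    # Toy model of cases 1 and 2 above; all numbers are arbitrary.
    def simulate(engagement_jedgar, engagement_mlk, years=5, switch_rate=0.1):
        share_jedgar, share_mlk = 0.5, 0.5
        for _ in range(years):
            gap = (engagement_mlk - engagement_jedgar) / max(engagement_mlk, engagement_jedgar)
            if gap > 0:
                moved = switch_rate * gap * share_jedgar   # users drift JedgarTube -> MLKTube
            else:
                moved = switch_rate * gap * share_mlk      # negative: drift MLKTube -> JedgarTube
            share_jedgar -= moved
            share_mlk += moved
        return round(share_jedgar, 3), round(share_mlk, 3)

    # Case 1: the censored recommendations were less engaging than their
    # replacements, so the censoring platform slowly gains share.
    print(simulate(engagement_jedgar=1.05, engagement_mlk=1.00))

    # Case 2: the censored recommendations were more engaging, so the
    # censoring platform slowly bleeds users.
    print(simulate(engagement_jedgar=0.95, engagement_mlk=1.00))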


> That's why the media platforms and government censors are trying to set up a global censorship system — whichever platform steps up first to be JedgarTube will lose viewers to whoever's censorship implementation is a step or two behind.

The market isn't going to do what we want, so we're going to implement it using centralized power instead.

I didn't elect these big companies to censor the public discourse, and it's highly disturbing that they're working hand in hand with the government to do this.

> Note that none of this logic depends on normative judgments

The "logic" depends on the normative judgement that what would result from a free market would be bad. There have been too few big players doing too much meddling and manipulation of the markets contained within their walled gardens to know whether that would be the case or not.


> I didn't elect these big companies to censor the public discourse

Well, there are things you can do about it, but I didn't want to get into that when describing the incentives that make it difficult or impossible for them to censor the public discourse unilaterally, but possible to do as a cartel. I think it's important for people to have a solid factual understanding of what's going on in order for their normative judgments, and the plans they make based on those judgments, to be well-founded.

The "logic" depends on the normative judgement that what would result from a free market would be bad

No, none of the cause-and-effect relationships I described depend on anything being bad. They function in exactly the same way regardless of whether that would be good or bad.


> Well, there are things you can do about it, but I didn't want to get into that when describing the incentives that make it difficult or impossible for them to censor the public discourse unilaterally, but possible to do as a cartel.

If you are acknowledging that a cartel in collusion with the government is acting to censor everyone's speech, that's at least a start.

> No, none of the cause-and-effect relationships I described depend on anything being bad. They function in exactly the same way regardless of whether that would be good or bad.

The sneaky conceit is that you've constructed a causal scenario that presupposes the undesirability of a supposedly inevitably caused future. If we had more commerce, more free speech, more sharing of culture, and more cultural change, we would have less extremism. History shows us this quite clearly. It's when big, centralized powers start mucking about with the lives of individuals that extremism rears its ugly head and becomes a problem.


What leads you to believe that I presuppose the undesirability of people supposedly inevitably having access to Martin Luther King? Perhaps you aren't from the US, so you don't understand the cultural context of my example of extremism? Or did you mean that I am claiming that the formation of a censorship cartel is supposedly inevitable? I wasn't claiming that; I was claiming that in the absence of a censorship cartel, people will tend to move to platforms whose recommendations aren't censored, and so they will have access to Martin Luther King and other extremist content.


> I presuppose the undesirability of people supposedly inevitably having access to Martin Luther King?

I'm tiring of your willful redirection of referents, which I think is the point of your construction. Your "logic" (which basically amounts to: censorship is necessary because it's inevitable that bad people will win) smacks of the same mental gymnastics people used to "prove" the existence of god.

> people will tend to move to platforms whose recommendations aren't censored

Yes, and this will eventually result in MLK (whose positions are now centrist) winning and authoritarians on both extremes losing. Your scheme uses a lot of words to attempt a pretty transparent sleight of hand.


Again, I am not arguing that censorship is "necessary", nor am I saying anything about bad people. I am saying that individual websites are not in a position to impose censorship unilaterally in a very effective way, and those who attempt it will lose users to those that do not, but that a cartel of a sufficiently large group of popular websites is in a position to impose censorship. This is true entirely independent of whether censorship is good, bad, or mixed. Furthermore, it doesn't imply that censorship is either inevitable or impossible, since the formation and continuation of such a cartel is an uncertain contingency upon which effective censorship is conditioned.

As to MLK's positions now being "centrist": I suspect you're one of those people who haven't read anything he wrote other than "I Have a Dream", and haven't even read more than excerpts from that, but all of that is irrelevant to the argument, because in my original post I was talking about today's equivalent of MLK, who is by definition not centrist.

Your accusations of dishonesty — "You[]…attempt a pretty transparent sleight of hand" — are false and unfounded, and you need to withdraw them immediately.


> They had the technology to create a monster, but don’t have the technology to stop it.

At first I was going to argue that they could stop it and that they did have the "technology" to do so, but the more I thought about it the more I concluded that in the system we have built, they really cannot. Profitable ideas are unstoppable until they are proved less profitable than some other idea, or regulated out of existence. Fiduciary duty enshrines this into law. Sufficient competition ensures that it is the only winning strategy.

Obligatory SSC: https://slatestarcodex.com/2014/07/30/meditations-on-moloch/


> Fiduciary duty enshrines this into law.

A common misconception: https://www.nytimes.com/roomfordebate/2015/04/16/what-are-co...


Facebook doesn't have fiduciary duties to anyone, unless they're registered financial advisors.


It's amazing to me how political this is and how oblivious to that fact a large portion of the NZ population is.

This horrible act happened on the current government's watch. I've seen more outrage and effort from the government over the spreading of the video and manifesto than introspection into how this slipped through in the first place. I suppose in a country where carrying a weapon for the purpose of self-defense is considered a crime, something this terrible shattering the illusion of the nanny government protecting you requires a whole lot of deflection and ultra maneuvers to secure the next election cycle.

New Zealand's knee has jerked so hard I'm feeling it in my groin 8k miles away.


NZers have observed the USA's experiment with reducing violent crime by flooding the nation with guns, and there is very strong support across the political spectrum for trying something else: restricting access to the most dangerous guns, and deplatforming murder advocates. It is not some authoritarian ploy.


> It is not some authoritarian ploy.

Regardless of whether you think it's a good idea or not, restricting access to guns and free speech is an authoritarian play: it's using the authority of the State to restrict certain liberties in the intent of doing good.


> it's using the authority of the State to restrict certain liberties in the intent of doing good.

That's called civilisation.

And the US doesn't have absolute free speech. It never did.

Just like other democracies, it has restrictions on speech for various reasons, some good and some bad. It's about where an individual country draws their line on the spectrum:

https://en.wikipedia.org/wiki/United_States_free_speech_exce...

Some US states have obscenity laws that other developed democracies don't. Obscenity laws. Just think about how insane that is as a concept.


All laws "use the authority of the State to restrict certain liberties in the intent of doing good".


Sure, all law is an exercise of authority, and hence authoritarianism. The commenter was merely calling out that... no matter how you dice it... a new law is always authoritarian, whether you acknowledge it or not.

Saying 'this law is not authoritarian' is always a contradiction.


Most of the conversation here is around free speech and censorship. I don't believe in government, in general, limiting the flow of information through censorship and 14 year prison sentences.

As to the USA.. Well, I think it's important to consider that the USA is about 68 times the population of NZ. 68 New Zealands, think about that. Some of those New Zealands are quite safe even relative to other countries, and others are not. Firearm homicide rates, along with other crime rates, fluctuate wildly by region. And yet all those regions have the 1st and 2nd amendment.

IMHO, as a NZ resident, NZ is definitely safer and nicer than the bad parts of the US. However, I do not believe it has much to do with its loose free speech protections and firearm restrictions. New Zealand is probably sturdy enough to have stronger free speech and personal liberty protections while being as safe as or safer than the safest regions in the USA.


The "restricting access to the most dangerous guns" is a buzzword/phrase whose categorization expands the more you apply it.


> Tech companies and governments sign up

Two wolves and a sheep vote on what's for dinner. What's for dinner being our most fundamental rights as citizens.


Only Americans have fundamental rights.

Everyone else has "hate speech," which, depending on how that's interpreted, can mean anything from thinly veiled blasphemy laws to an inability to bluntly criticize Scientology or Islam to charges for making Nazi jokes.


Preventing the viral spread of the videos is, I think, unquestionably ideal, but there's also a section in this "Call" stating another goal: to "Counter the drivers of terrorism ... to resist ideology and narratives... through education and building media literacy..." [some words removed so the message is less hidden]. It's hard not to suspect political motivation, given that Internet platforms are turf wars for politics these days.


Imagine the next "favorite" presidential candidate who has $1.4 billion in corporate funding. But late in the game it seems like the people may actually vote for her opponent. What is this corporate money to do? Why, implicate her opponent's followers in some kind of "terrorist" act and get their forum taken down due to the "call" and "lessons learned from the tragedy".


The Christchurch shooter amassed a cache of weapons, and also posted a copy of his manifesto and a link to his real Facebook account to 4chan.

The debate over censorship and Facebook algorithms amplifying abhorrent content is one thing, but I'm surprised by the lack of scrutiny of the security services over this, especially for a member of the Five Eyes. I can't help but feel this could have been prevented without any of the changes being proposed.


4chan and LiveLeak have both been blocked (via DNS) in Australia and New Zealand. So the scrutiny, albeit region-specific, has occurred.


Just to expand on this a little bit: it is contingent on the ISP, and as far as I know it covers the following websites:

- voat.co

- 4chan.org

- 8ch.net

- liveleak.com

- archive.is

- bitchute.com

- zerohedge.com

- kiwifarms.net

I think I'm right in saying that Telstra, Optus and Vodafone are the 'Big Three', and they have blocked the above.

Here in NZ, it's Vodafone, Spark and 2degrees, all of whom, I understand, have blocked access, though I've been unable to verify this first-hand.
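For anyone curious how these DNS-level blocks are usually verified, here's a minimal sketch (assumes the third-party dnspython 2.x package; 1.1.1.1 is Cloudflare's public resolver, and the ISP resolver address below is a placeholder you'd swap for the one your ISP hands out):

    # Compare answers from a public resolver and your ISP's resolver.
    # Requires the third-party dnspython package (pip install dnspython).
    import dns.exception
    import dns.resolver

    PUBLIC_RESOLVER = "1.1.1.1"   # Cloudflare public DNS
    ISP_RESOLVER = "192.0.2.1"    # placeholder; substitute your ISP's resolver

    def lookup(domain, nameserver):
        r = dns.resolver.Resolver(configure=False)
        r.nameservers = [nameserver]
        r.lifetime = 3  # seconds
        try:
            return sorted(rr.to_text() for rr in r.resolve(domain, "A"))
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            return []

    for domain in ["4chan.org", "liveleak.com"]:
        # Empty or differing answers from the ISP resolver (e.g. a redirect
        # to a block page) suggest DNS-level blocking.
        print(domain, "public:", lookup(domain, PUBLIC_RESOLVER),
              "isp:", lookup(domain, ISP_RESOLVER))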

There are also hefty prison sentences [0] (up to 14 years) and fines for people who read/distributed the manifesto and watched/shared the original footage.

Edit: More comprehensive block-list can be found here: [1]

[0] - https://www.nytimes.com/2019/03/21/world/asia/new-zealand-at...

[1] - https://www.citizensagainstidiocracy.com/93/internet-censors...


Wait, are you saying that watching the video is considered a crime in and of itself? Not redistribution, and not even mere possession, but just seeing it?


Evidently you can't even read the manifesto (which involves possession, of course; dunno if you get in trouble for reading it over someone's shoulder).

https://qz.com/1579660/new-zealands-manifesto-ban-explained-...

According to that article, having a copy is worth 10 years in the pokey.


Well that makes me want to go read the damn thing. Good job NZ. This is why you get left off maps.


Certainly. Like child porn, watching snuff movies was already illegal in NZ before the alt-right terrorist attack; all the NZ censor did was confirm that this snuff movie fell under that category. It was essentially born illegal. No one passed any special new laws to make it so.

It also falls under the US supreme court test for obscenity and is equally illegal in the US


>It also falls under the US supreme court test for obscenity and is equally illegal in the US

I don't think that's correct.

There are no federal obscenity laws. The U.S. government does not expressly prohibit obscene conduct. In fact, the U.S. government expressly protects some communications in the First Amendment to the U.S. Constitution.

https://legalcareerpath.com/obscenity-law/


18 USC 71 ... is the federal obscenity law. It's been much patched in and around child porn, but the basic law is still there. Of course, there have been a lot of Supreme Court rulings in and around it, culminating in the Miller test ... Your 'shooting video' is really a snuff film, probably one of the few genuine ones (along with those made by ISIS, and just as bad), and it certainly falls under the Miller test.


You can't just redefine a murder video as pornographic because you don't like it. You mentioned the Miller test, so you already know it requires that the material depict "sexual conduct or excretory functions specifically defined by applicable state law".

Ignoring the state law requirement (which you haven't cited), the video does not depict sexual conduct, and there's no way you can twist this into being illegal in the US. It's not illegal. Period. ISIS videos aren't illegal either.

You've clearly misunderstood our laws.


18 USC 71 is about 'obscenity', which doesn't just include pornography. These days it includes a bunch of clauses about child porn, but that's mostly about politicians wanting to get their names on the board. The original, base law is more general, and its definition comes from common law modulated by many Supreme Court decisions.


I mean, /r/watchpeopledie was active for nearly a decade and never had legal issues. I'm not sure this is enforced in any meaningful way.


[flagged]


Wow, that's an incredibly judgey and myopic perspective. And wrong, I believe.

I've personally watched a decent amount of gory stuff just because I like to understand the world I live in better.

Do you think people who watch violent R rated movies are sick fucks too?


There's a difference between watching fatal car accidents or faked scenes in movies as opposed to live criminal acts of violence on real people, but there are certainly grey areas too. Who hasn't seen people jumping from the World Trade Center fires?

We're drawn to, and learn from, the plight of others, but there are also legal standards to protect real people who are being filmed while being criminally abused, rather than just acting the part in a video portrayal of such.


>It also falls under the US supreme court test for obscenity and is equally illegal in the US

I fail to see how a shooting video appeals to the prurient interest in any way.


Hah, good luck enforcing that. Besides, most people who wanted to have already seen the video and read the manifesto, because they had proxies set up to get access to US video streaming catalogs.


zerohedge.com is not blocked on the Telstra (Australian) network as of yesterday (when I last visited).

Don't know about the other sites.


I don't think you understood my point. Blocking websites is not scrutiny of the intelligence services.

I'm saying an individual with weapons doxed himself and telegraphed his intentions on a public forum and the security services completely missed it.

That could all have been picked up without any new laws or intervention from Facebook. We need to analyse how it was missed and learn from it.


Ah I see, I did misunderstand.

I agree with your comments entirely.


And according to my kids ‘everyone’ at school uses VPNs.

The only people who don’t have access to these sites are the people who don’t want to see it.


AFAIK (and according to Wikipedia) the manifesto was only posted "minutes before the attacks began".

There wasn't any system for gun tracking that would let any agency detect he was amassing a cache. That is likely to change.


The Royal Commission of Inquiry is just beginning and will specifically focus on what the relevant agencies did, or failed to do.


One of the questions the public is asking them is why White Supremacy is not mentioned in 10 years of NZ SIS and GCSB reports. For readers unfamiliar with the agencies involved, the NZ SIS is NZ's domestic intelligence agency, whereas the GCSB is ostensibly focused on foreign intelligence.


It's almost like they give more of a shit about people posting things on the internet than some looney slaughtering 51 people.


- he posted on sites which were considered targets
- he posted a manifesto online (it's been censored in NZ)
- he livestreamed the act (also censored)
- his attack has been referenced by subsequent attacks (this was his goal)
- he wants to use the courts as a platform (our media has voluntarily agreed to censor proceedings)
- he said his rights are being infringed by not having phone and other communication (his "rights" need to be weighed against all the dead and injured)
- other groups want to politicize and publicize his agenda (should they be censored?)

His efforts to use the web as a platform for inciting hate and further violence led to our government's response. If the US President incites hate and violence via Twitter, we would be having the same discussion, no?


The idea of censoring his manifesto is strange to me. You'd think people would just want to argue the points brought up in his manifesto on their merits and win mindshare that way.


We heard he cites Anders Breivik's manifesto in his own (I haven't read them), so the domino effect of self-radicalization is what concerns them, I think. A bit like how we have laws against reporting on suicides, because it was determined that suicides statistically increased when they were widely publicized in the past.


How well is that working against the flat-earth and anti-vax communities?


Well, a good chunk of the US anti-vax movement seems to be ultra-Orthodox Jews: https://www.nytimes.com/2019/05/14/nyregion/measles-vaccine-... Their religious beliefs don't permit them to have unfettered Internet access because it'd expose them to corrupting ideas (like, say, the idea that vaccines are safe).


And yet more than a good chunk of the anti-vax movement worldwide has unfettered Internet access. Ultra-orthodox Jews certainly aren't a factor here in New Zealand.


Yeah, it's almost like the Internet has no inherent effect one way or the other on the rise of anti-vax beliefs or something.


> If the US President incites hate and violence via Twitter, we would be having the same discussion, no?

No. We might be having a conversation about the inappropriateness of such comments, but we have no laws against hate speech. In fact, a sitting president can call for imminent violence against a person or group (which is illegal) and the only recourse would be impeachment or waiting until the next election (he can be prosecuted after his removal though). So there wouldn't be much we could do if the US President did call for violence.


That looney was radicalised on the Internet, by a subset of Internet culture.


He explicitly said visiting Paris and Turkey radicalized him.


He also explicitly said he'd learned his radical ideas from 8chan in his final post there; and furthermore, there's good reason to believe his selection of travel destinations was already motivated by his far-right worldview of a 'Muslims-vs-the-West' conflict.

Lastly, it is a mistake to take a document intended as propaganda directly at face value; for instance, his claims of "ecofascism" on inspection seem to be selected more to inspire infighting between the political left and right (one of the aims he claims to have regarding gun legislation) than motivated by any genuine concern for the environment; for example, his manifesto was devoid of environmental concerns unless we were to first concede that a fascist obsession with racial birthrate disparities is an "ecological" concern.


According to his manifesto the largest factor was a visit to France, not online activity.


His manifesto also included memes and copypasta. He listened to meme music as he drove to the mosque, and he said "Subscribe to PewDiePie" before he shot the first victim.

The fact that he was a member of an online community that considered his views normal is extremely significant.


Oh yeah? How about the fact that he drove a car, or ate meat, or spoke English, or traveled a lot?

The "online radicalization" thing is complete and total bullshit: it's an excuse to ramp up the totalitarian surveillance state in a more obvious way than the "war on terror". Now it's a war on people who read things on the internet. Magic internet forums might make me into a terrorist ... or a hacker; better put me in prison!


> Magic internet forums might make me into a terrorist

He found an echo chamber that reinforced his beliefs, which happened to be filled with internet edgelords that egged him on.


And you don't think his manifesto is propaganda rather than an attempt at reliable narration?

Coming to "the truth" by seeing the reality of the world first-hand is a much more stirring story than being radicalised by lies on the Internet. Plus it fits directly into all the other racist propaganda put out by white supremacists about the state of Europe, and France in particular.


More internet censorship?


I guess they're not going to ban content that glorifies war, though? That would be a bridge too far.


Don't be mad; the military is paying quite a bit for advertising.


Of course not. They are going to promote it. They are going to give the pro-war "authoritative sources" preferential treatment.

It's funny how so many of the pro-censorship comments here reference ISIS. These same people would defend the pro-war news companies that have spread false propaganda leading to illegal wars that killed millions of innocents in the past few decades.

If these companies are going to censor "extremism", then they should start with the "authoritative sources".


Everyone decries censorship, but always fails to provide an alternative to try to stop extremism online.


So "we must do something, this is something, we must do it"?

My lack of an alternative doesn't make your idea good.


If your bad idea is better than doing nothing and nobody can come up with something better, you could do it.

The problem is YouTube's recommendation algorithm, which drives users ever deeper into weirder and more outrageous videos, if they are susceptible to it. It's a bit like balancing on a knife's edge, and that very system feeds directly into those generating revenue from it. It is a system that rewards cheap, outrageous lies over intelligent and balanced truths.

In today's age where every truth will be declared to be just an opinion while every opinion will be presented as the truth, we don't need such a social experiment.


> If your bad idea is better than doing nothing and nobody can come up with something better, you could do it.

Where is the evidence that this is better than the status quo?


If your problem as a society is the spread of extremist content and ideology (in part because social media algorithms reward everything that provokes reactions), then "doing things" to curb specifically the extremist content linked to terrorists might not be a bad start?

Isn't this the way the US government treats Islamist terrorist propaganda too? I am not sure why we should treat right-wing terrorists differently. Both are inhumane, cruel and undemocratic ideologies that would end free speech as soon as they got into power.

Note that this is meant to target right wing extremism, not right wing ideas per se. I am all open for discussion, but when people start promoting violence, genocide and hate, maybe a society is better off not protecting them unconditionally.

Free speech is also a cultural issue, e.g. in Europe you could shock some people with explicit violence, while nobody cares about exposed female nipples. In the US it is exactly the other way round – they are extremely open to violence, while in other areas free speech doesn't seem to count as much.


> Isn't this the way the US government treats Islamist terrorist propaganda too?

No.


Not the US government, but Twitter, Youtube and Facebook do have programs to block ISIS propaganda for example.


Everyone decries censorship, but always fails to provide an alternative to try to stop extremism online.

The same as always. Criticize it. Maintain our freedoms. Continue to live as free people. It's only when we sacrifice our freedoms for security that the terrorists win. Unfortunately, we have been doing exactly the wrong thing by dribs and drabs for going on 2/3rds of a century now.


Saudi Arabia considers atheism to be extremist. Many people consider child mutilation ("gender surgery") to be extremism. Some people consider Zionism to be extremism. Some people consider border security to be extremism. Some people consider abortion restrictions to be extremism.

So now, what is extremism?


I'd rather live with the remote threat of extremism than live in a censored world.

Would you rather live in North Korea with no extremism?

You have a greater chance of getting struck by lightning than dying from extremism and yet you'd give up your right to free speech for some remote threat?

That kind of thinking is the cause of North Korea and Nazi Germany. Authoritarians always use remote threats to justify taking your rights away.


>You have a better chance of getting struck by lightning than dying from extremism...

I know this is pedantic, but you got me curious so I went to check the numbers.

Odds of death in mass shooting (US only): 1 in 11,125

Odds of death by lightning strike (US only): 1 in 161,831

So it seems that it's actually a lot more likely for an American to die in a mass shooting than in a lightning strike.

Edit:

Sorry. The source was the National Safety Council, National Center for Health Statistics, and the Cato Institute.

https://www.businessinsider.com/mass-shooting-gun-statistics...


From the same source:

Odds of death by police: 1 in 7700

Odds of death by car ("any motor vehicle incident"): 1 in 315

I'd agree with basetop, I'd rather live in a world free of censorship and "full" of extremism.


These seem like very high odds.

Odds of death by car ("any motor vehicle incident"): 1 in 315

Per the source, "any motor vehicle incident" is actually 1 in 108.


They're lifetime odds, not yearly.

> 1 in 108

Oops you're right, I picked the wrong line :D


Not all mass shootings stem from extremism. Gang violence also falls under the same definition.

Also from the business insider:

"There is no broadly accepted definition of a mass shooting. The Gun Violence Archive defines a mass shooting as a single incident in which four or more people, not including the shooter, are "shot and/or killed" at "the same general time and location."

Another source suggests that death rate from "Islamic terrorism" in the US is somewhere around 1 in 3,500,000. If you presume other types of extremism are similar, there is still a significant difference from the mass shooting rate.

https://www.businessinsider.com/how-many-mass-shootings-in-a...

https://politicalscience.osu.edu/faculty/jmueller/since.html


> Another source suggests that death rate from "Islamic terrorism" in the US is somewhere around 1 in 3,500,000.

Yeah, that can't be right. Or, it can be, if one only takes into account years after 2001, when at least 3000 people were killed, or 1 in 100,000 US residents.


The 1 in 3,500,000 rate is for years 1975 to 2015 and includes the 9/11 attack. Since 2001, the rate is about 6 deaths per year for a total of 100 over 17 years. This gives a post-2001 rate of less than one in 50 million.

See page 4 of the previously referenced document.


Ah, so you're likely using the annual rate. Parent was using lifetime risk.


Very good point, I missed that


edit: I mixed up yearly statistics vs "lifetime likelihood". I'll leave what I wrote for posterity

If the US population is 328,000,000, and each of us has a "1:11,125 odds" of dying in a mass shooting, that would seem to indicate that there are 328m/11,125 = 29,483 "mass shooting deaths" in any given year (I'm happy to accept corrections on my math here, maybe I'm completely missing something). That's patently false, and quite a spurious definition of "mass shooting". My definition of "mass shooting" is an unprovoked attack for terroristic reasons, i.e. NOT a { jealous spouse/drug dealers/gang bangers }. I don't keep an active tally, but I would estimate the number on an "average" year to be about 50, double that during the year we had the Vegas shooting. 50/year puts the odds at roughly 1:6,560,000, or about 40x less likely than getting killed by a lightning strike.
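
For what it's worth, the confusion running through this subthread is just annual rate versus lifetime odds. A rough back-of-the-envelope conversion, as a sketch only (it assumes a ~79-year life expectancy and treats the risk as uniform across a lifetime, which it isn't):

    US_POPULATION = 328_000_000
    LIFE_EXPECTANCY_YEARS = 79  # rough assumption; risk treated as uniform over a lifetime

    def annual_deaths_from_lifetime_odds(lifetime_odds):
        # Lifetime odds of "1 in N" correspond to annual odds of roughly
        # "1 in N * life expectancy", so expected deaths per year are:
        return US_POPULATION / (lifetime_odds * LIFE_EXPECTANCY_YEARS)

    # 1 in 11,125 lifetime odds of dying in a mass shooting
    print(round(annual_deaths_from_lifetime_odds(11_125)))   # ~373 deaths per year
    # 1 in 161,831 lifetime odds of dying by lightning strike
    print(round(annual_deaths_from_lifetime_odds(161_831)))  # ~26 deaths per year

Roughly 370 deaths per year is in the ballpark of the Gun Violence Archive's broad "four or more shot" tally, and a couple of dozen lightning deaths per year is close to the commonly quoted US figure, so the lifetime odds above look internally consistent once you stop dividing the population by them directly.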


Gang violence also falls under the same definition. Also from the business insider:

"There is no broadly accepted definition of a mass shooting. The Gun Violence Archive defines a mass shooting as a single incident in which four or more people, not including the shooter, are "shot and/or killed" at "the same general time and location."


getting struck by lightning != death by lightning strike


It's also produced Singapore.


I would argue that Singapore is an example of a benevolent dictatorship. The problem with all such arrangements, as was already noted by the Ancient Greeks, is that you have no guarantee that the next dictator is going to be benevolent.


That's a pretty silly example. North Korea is North Korea precisely because of the extremism.


I don't understand that statement. I believe North Korea is a totalitarian state with concentrated power in few hands and no tolerance for dissidence. Are you saying they allow non-state actors to speak and take action freely to promote extreme ideologies there?


The poster probably means that NK is NK, because of its own extremist ideology. So the only way to really get rid of extremists seems to be to become more extreme than them.


If that's the meaning, then extremism led to the US, France, and a number of other countries.


I didn't say it was the only way of getting rid of extremism. I'd even argue that getting rid of extremism by installing an extremist regime makes as much sense as chopping a leg off because you want to get rid of a headache.

It is important to remember that no nation, no matter how great its tales of freedom or historic shame might be, is immune to extremism.


North Korea became North Korea because of censorship. It is the most censored country in the world. And the point is that there is no "extremism" in North Korea because censorship allows only "authoritarianism". That's the world you want to live in?

I agree with you that censorship is extremism and it should be fought against.


Extremism to a Westerner, perhaps. Certainly, the ideology behind North Korea would be censored under most of the proposals on this thread, but given that the government of North Korea would not censor it, and it is the only government with any control over the country, I don't understand by what metric you could possibly say it's extremism. Unless you're appealing to divine justice or something, which would be great, but I sincerely doubt it.


Okay, I'll bite. Here is an alternative. Make it illegal for any "news" agency to publicly name the group or individual that commits any "terrorist" act. Make it a multi-million dollar fine per instance. We need to stop giving these people their 15 minutes of fame or infamy. All coverage must be about the victims only.

We also need to collectively care about stopping extremism with other approaches. One thing is that most people who commit these heinous acts have reached a point where their lives have no perceived value. They are then easily radicalized. (See blue collar America blaming all their jobs lost to automation on Mexicans.) People who have money for their family and are generally happy don't go around killing for some cause. Addressing wealth inequality can help.

The challenge of supporting this kind of "good censorship" is always who will watch the watchers? What passes for terrorism in the US is different than in a country like Russia or North Korea or .... A lot of people (even in the US) are very careful about what they post on socials. What happens if people who attack a certain viewpoint or organization are suddenly declared "evil" and the cause of all our ills? Is someone on a bowling team with a closet Nazi or KKK member also guilty? You only need to look back at McCarthyism to see how easily "good" people can be weaponized into something terrible.


Great plan. All it takes is a government to designate a group as terrorists and any mention of them results in mega-fines.

nit: Among others, neither the 9/11 gang nor the Christchurch shooter was poor.


That is unconstitutional in most countries, because you have a right to a public trial. In the United States, it's especially illegal, since the press has a right to free reporting, including on public trials.

The danger here is the government starting to weaponize this 'freedom' by designating mundane crimes as 'terrorism' and then holding secret trials. Given that this is precedented, and leads to significantly worse outcomes (including full-on genocide), it seems that it's probably better to just accept that there will be mass shootings. After all, even adding up all the mass shootings in the world, you will still not come close to the slaughters perpetrated by governments holding secret trials of dissidents.


What kind of alternative would you like to see? How do you propose to evaluate if it's preferable to censorship?


That's the thing, I can't recommend an alternative other than censorship. All I know is that violent extremist groups or identities like ISIS, white nationalism, and others can be linked in part to a good chunk of the mass shootings and other acts of mass violence that many parts of the world have experienced over the past 10-15 years.


Those are excellent points. The rise of extremism is very worrying. Have you considered the possibility that this might not be unprecedented, and that censorship may have been tried previously?

It may be worth considering that it might be acceptable to not choose censorship as a policy. Some might opine that an inability to find an alternative should not be construed as support for censorship, and that this is merely the politician's syllogism at work.


I don't know that you can prevent hostility after combining large groups of people from different cultures. Maybe throttle the stream of immigrants from different cultures so people can assimilate?

Better than censorship.


What if that throttling involves a good percentage of them dying? Is that still better than censorship?


Preserve our freedom AND limit population growth? Sounds like a win win.


You can’t stop it.


The bigger problem is how selective the stopping of extremism is in many venues. This is really worrying.

Here in Germany it's very popular to talk about, chastise and silence "Nazis" (be it real or assumed), which sort of makes sense knowing Germany's history -- and I say this from the perspective of a Slav whose people were exterminated en masse by real Nazis.

At the same time various, frequently militant, extremists (e.g. anarchists, AntiFas, Islamists) are allowed to speak their mind, openly recruit and even spread calls to violence against their political opponents.

Mind you, Wahhabist groups were literally spreading their message just a few years ago (still in 2017) in front of shopping malls here in Berlin.


The "Lies" group giving out Qurans got banned a few years ago and groups on the left are still regularly hit with §129 trials and cant be employed in the public sector. The AFD is far from being silenced or even prosecuted. They are a far right party and that is pointed out. I dont think there is a miss characterization. And not even Pegida marches are prosecuted or banned. People on the right being prosecuted are the same old nazis as before. Groups and parties like the NPD, the Unsterbliche group or the DritteWeg who run around in uniforms with illegal torchlight marches shouting for a return of national socialism.


>Here in Germany it's very popular to talk about, chastise and silence "Nazis" (be it real or assumed), which sort of makes sense knowing Germany's history.

And yet it doesn't even prevent the core of the problem, which is that the supreme law of the land is easily abusable as a weapon once the Nazis get in power (and if the German economy tanks to the point where people can no longer buy bread for a day's work, which was true in the Weimar Republic, they will).

Anti-speech laws have never been about stopping Nazis. It's all about the feeling that they stop Nazis, which (especially in majoritarian-biased politics) is all that really matters.


> Mind you, Wahhabist groups were literally spreading their message just a few years ago (still in 2017) in front of shopping malls here in Berlin.

So do Neo-Nazis, quite regularly. But the problem runs deeper. I find it acceptable for a society to say that it doesn't want certain types of messages to be brought forward. E.g. if your political message is to basically disallow any political message other than your own, why should society tolerate your message? Or if you go in all inflammatory and start to divide people and spell out goals of genocide – why would any society that wants to remain civilized not in some form penalize that kind of behaviour?

As much as I am for free speech, censorship in any form (be it your colleagues who stop interacting with you because of your shitty ideas) fulfilled certain societal functions, and a lot of the change we have seen in recent years also has to do with the fact that this censorship is not only gone, but has become the polar opposite: 30 years ago extreme opinions would have been drowned out in the sea of mainstream opinions.

Today we have digital systems that penalize mainstream opinions and reward extreme opinions on the fringes. This leads to entirely different discourses and also ultimately to the need of more censorship to retain social stability.

The crucial question is what this censorship will look like once it comes (and it will).


Moderating discussion forums is not censorship.


It is, by definition, censorship. It's not _government_ censorship, but it's still censorship.


Yeah, but so what?

Companies need censorship to make money. They shouldn't have to lose money simply to satisfy your or my idea of not censoring.

Now the government? Yeah. That's a whole other issue.


Yeah, but so what?

Companies need to trample on your human rights to make money.

Now the government? Yeah. That's a whole other issue.

False. Freedom of speech outweighs property rights and corporations' right to "freedom of association."

https://en.wikipedia.org/wiki/Marsh_v._Alabama

https://www.youtube.com/watch?v=lBozijndSLc


It becomes a political issue when all companies start engaging with it, especially when they do so under external political pressure ("We really don't want to pass a law - how about you self-regulate?").

Consider the situation in Australia, where all ISPs acted in concert to block websites. It's technically not government censorship, but the effect is the same - so any utilitarian rationale behind restrictions on government censorship should apply here equally.

Or maybe this should just be considered a form of illegal cartel.


I'm going down a nitpicking rabbit hole, but the definition of 'censorship' is government censorship. The word comes from the Roman Republic position of 'censor'.

"The censor was a magistrate in ancient Rome who was responsible for maintaining the census, supervising public morality, and overseeing certain aspects of the government's finances." https://en.wikipedia.org/wiki/Roman_censor

When it's not government censorship, it's called self-censorship or something entirely different.


Ironically, I've rarely seen a thread with so much downvoting as this one. Is trying to grey-out a post the same as wanting to censor it? I just went through and upvoted most of the posts that were grey.


I was thinking about it. Making up filters to filter other people's expression, using rules that are not explicitly enumerated in governing laws -- is selective censorship.

Making up speech/expression filters that are explicitly enumerated by governing laws, is censorship too -- but this one is not 'selective', instead it is mandated.

So the opposition that people have on these topics, is towards 'selective censorship', not the 'mandated' one.

The next argument for 'selective censorship' -- is that a private business can make up its own rules.

My view is, yes, private business can -- but then, the content where selective-censorship was applied, cannot be available publicly without a fee.

So, if Facebook wants to do selective-censorship, then ok --

however, that filtered content should be available only to people who selected to participate in explicit business relationship with Facebook.


What constitutes ‘extremism’? It’s a term so vague as to be practically meaningless.


Extremism seems to match up closely with any opinion that is opposite to that of those in power.


And yet those in power are the ones who get to decide what the term means.


I'm going to take a swing at your argument from a different angle, so follow along for a second. How do you define an immigrant?

The reason why I bring this up is because I've often seen people embracing laws or actions targeting immigrants, but isn't this in effect the same slippery slope as attempting to define extremism when we talk about free speech? Especially as immigration law is often expanded over time, encroaching on the rights of 'citizens' (a term which is also eroded over time) as we define a secondary class of people to whom typical rights do not apply.

If we want to play the definition game, the exact definition of the word 'immigrant' depends entirely on the ruling class, ergo makes laws affecting immigrants equally as likely to cause the slippery slope effect as laws that affect extremists.



[flagged]


It is a slippery slope. For all intents and purposes we're accelerating down the slope.

The Orwellian nightmare you talk about already exists in countries like China. The naive presumption that the West will always support free expression is baseless. It will not unless it's vigilantly defended by Westerners. And once given up, liberty is not easily regained (see Venezuela, Cuba, USSR, East Germany, etc).

It's also worth noting that whoever holds power and is defending status quo (however benevolent or malevolent) determines what constitutes extremism.


Exactly. If we start censoring now, where's the Schelling Fence [0] between this and censoring mainstream political opinions? What logic used to censor things here won't be used to censor more things later?

0: https://www.lesswrong.com/posts/Kbm6QnJv9dgWsPHQP/schelling-...


Exactly. If we start censoring now, where's the Schelling Fence [0] between this and censoring mainstream political opinions?

I already see active attempts to re-define and re-label bog-standard Republican positions as "Far Right" and "Fascist." I've already encountered many people who want to throw the fact that such positions were mainstream, "down the memory hole."

What logic used to censor things here won't be used to censor more things later?

The effort to censor more things has already been ongoing.


All Western countries have governments and are states based on the rule of law, and all place some limits on speech and expression. Even the US doesn't allow certain forms of expression, such as libel, slander, falsehoods, terroristic threats, etc. The West has therefore never supported free expression, just as it has never supported free markets.

We are already on, and have always been on, that slope. Yet not all Western countries are like China or North Korea, odd.

It's almost as if that slope isn't as slippery as some would have us believe.


The US allows just about all expressions without prior restraint. Even shouting fire in a theater is allowed speech until a court says otherwise -- after the fact and with due process.

In contrast, censorship is a kind of prior restraint that restricts speech in the absence of judicial review or due process contrary to the rule of law.


Yeah, but it restricts speech on a media platform. Believe it or not, in the past 100 years it was pretty much the norm that certain content would not be published, e.g. in books or newspapers.

A lot of the stuff right-wing extremists say on YouTube today would have been censored on television in the past (depending on the country, of course).


Ah, the "cant happen here" argument.


Before the Internet came around, it was actively happening "here". You couldn't shop your violent extremist video around to pre-Internet mass media outlets, or if you did, you would probably be referred to law enforcement.

Not all forms of this look like China, some look like the USA circa 1990.


Pre-1990, extremists would maintain mailing lists and send each other communications and videos.

You have no right to mass media outlets. Just as I have no right to barge into your home and force you to listen to my speech... you have no right to force the owner of a tv or radio station to broadcast your message. Your right to free speech is not a right to violently force others to deliver your message against their will... those other people have the right to free speech and liberty too.


> You have no right to mass media outlets. Just as I have no right to barge into your home and force you to listen to my speech... you have no right to force the owner of a tv or radio station to broadcast your message. Your right to free speech is not a right to violently force others to deliver your message against their will... those other people have the right to free speech and liberty too.

I agree. I'm just saying that suppression of violent extremist propaganda in mass commercial media isn't some kind of violation of some centuries-old tradition, or a slippery slope toward Chinese style information control, as some people frame it, because it was literally the state of affairs in the US prior to the Internet.


Nobody forces anyone to hear from anyone else on Facebook or Twitter. You can block people, you choose who you follow. That sort of thing is woven into the fabric of the medium itself. TV and radio were traditionally limited to broadcast companies because there was only so much spectrum to go around. The internet is a different thing; it's not exactly analogous to any medium to ever come before it.

I don't even know if you and I have the same definition of extremist. If you mean promoting literal violence or any other illegal activities, you're absolutely right. No one has a right to that kind of speech.


No, more the "isn't inevitable" argument implied by the slippery slope fallacy.

But that may be too subtle a distinction for a thread like this.


Parent didn't imply it was. Freedom is often lost incrementally. Framing that basic fact of history as a logical fallacy is unwise.


>Parent didn't imply it was.

Yes, they did.

The purpose of suggesting that "extremism" as a term is "so vague as to be meaningless" is to imply precisely that. It's a common enough rhetorical tactic that it can be taken for granted in any thread where speech or censorship (particularly of what is considered right or far-right politics) is the subject.

Extremism in context has a commonly understood definition, and claiming otherwise is not a convincing argument.


No, parent pointed out what I repeated: that history shows that freedoms are lost incrementally. You are the only one talking about inevitability... which is par for the course, considering you also think it's totally obvious what a word means that is, by its very nature, inherently undefined.


My point was not that ‘extremism’ does not have a definition; my point was that what content falls under that definition is very much subjective. I could have phrased my original comment better.

Obviously, the drafters of this agreement had certain types of content in mind. But those people are not the ones who will be implementing the policy.

The types of content targeted by this kind of policy depend very much on who is making the decisions.


Here are some examples of how Western countries with democratic governments and rule of law actually utilize those laws that place limits on speech and expression:

https://theintercept.com/2017/08/29/in-europe-hate-speech-la...


Bless your heart.

It was actually a trick question: you cannot tell me what “violent extremism” means in this context, because you don’t know. You don’t know because you’re not the one who gets to decide.

The only definition that matters is the one held by whoever will be deciding what gets censored and what doesn’t.

How might your political adversaries choose to apply a filter on “violent extremism”? I could easily see claims arising that abortions are both violent and extreme, and that pro-choice material is therefore “violent extremist” content. That’s probably not what the people behind this agreement had in mind, but at the end of the day, that doesn’t make much difference.


[flagged]


>I thought I was going to be able to have a rational, intellectually satisfying discussion but

But instead you lashed out irrationally, posted a definition that went against your point, and then dismissed the entire thread because you realized you weren't supporting your argument.


The powers that be. The shadowy cabal. The unseen hand which controls all aspects of society with absolute and arbitrary power.

Actually, this does seem to be the case with a somewhat reduced context of just media/social-media/tech in 2019. Journalist collusion is well documented. Big company manipulation of local journalists is documented. We have seen indications of CEOs of different companies cooperating to suppress politics unacceptable to them, even to the point of implementing censorship and demonetization within their platforms, and outside their own companies and spanning multiple platforms.

no one is capable of defending free speech without resorting to slippery slopes and absolutism

That's begging the question. Free Speech is inherently an (almost) absolutist position. There are other rights which take precedence, but to be valid, no one can be the arbiter of the philosophical or political positions taken -- only of incitement to specific illegal actions. Otherwise the arbiters on speech exist, and there is no Free Speech.

In reality there is no need to defend Free Speech. Society isn't on the verge of collapse here. Maintaining Free Speech won't lead to an escalation of extremism. Only increased suppression will do that. What's really happening, is that big companies and governments are engaging in a power grab which weakens Free Speech.


Free speech is not an absolutist position and it does not exist. There is no single country on this earth, past or present, which has held an absolutist free speech position. Yes, that includes America.

It may be enshrined in our constitution, but that has never stopped us from selectively applying constitutional rights to different classes of citizens or people we consider to be non-people. The real problem is pretending that an absolutist position exists and acting like we're becoming more free when the reality is that violent bigotry is rising while we complain about the imaginary slippery slope into fascism whenever someone decides to take action against them.


> It may be enshrined in our constitution, but that has never stopped us from selectively applying constitutional rights to different classes of citizens or people we consider to be non-people.

So, because some people were denied their rights in the past, it proves that right doesn't exist and shouldn't be protected?


So, because some people were denied their rights in the past, it proves that right doesn't exist and shouldn't be protected?

"The arc of the moral universe is long, but it bends toward justice." --MLK

I would agree we don't do this: "Selectively applying constitutional rights to different classes of citizens or people we consider to be non-people."

Instead of forcing a power differential on people we don't like by limiting their access to things on the Internet, we need to convince them. It may take time. But that is the way of just, enlightened people. Using coercion against people by taking away viral-dissemination and discovery from them isn't any different than taking away books, taking away printing presses, or not allowing them into good schools.


Nothing personal, but that's because we're old enough to have seen everyone slipping down the slope.


> Although this is Hacker News, and I know it's futile here to suggest that standing in the way of bigots and extremists is anything but a slippery slope towards an Orwellian dystopian nightmare of fascism and thought-police and boots crushing our heads forever

Of course it's futile, because you're not really "standing in the way of bigots and extremists" but deciding for other people what ideas they are allowed to hear. I don't like extremist speech any more than I like obscene, puerile, blasphemous, or false speech.

I do enjoy using platforms which curate content so that it fits within these preferences, but I reserve the right to listen to other peoples' ideas and make up my own mind. If I then proceed to do something unlawful as result of my judgement, then that is my own fault and I will be held liable for it.


> If I then proceed to do something unlawful as result of my judgement, then that is my own fault and I will be held liable for it.

I am A-OK with that stance right up until it infringes on the rights of someone else. Taking illegal drugs in your own home vs taking them and then driving are both your "fault", sure, but one of them runs the risk of hurting others. But then how do you legislate for avoiding the second without infringing the first?


Is it possible that perhaps this difference could be split by legislating against harming others? This also has the benefit of not requiring legislative bodies to spell out every circumstance that could be involved.

It sounds like you might be in full agreement with the person you have quoted.


> Is it possible that perhaps this difference could be split by legislating against harming others?

It could. But I'd put (at least) extremist/hate speech into the category of "harming others" and that puts me on the side of "censorship" which, I think, is in disagreement with them (and, it seems, most of the people in this thread.)


I think that depends on how one defines harm. I think we can all agree that bodily assault, destruction of property, and similar are all clear harms. So is a loss of human rights, freedoms, and liberties.

Personally, I find myself deeply skeptical of unprovable, unverifiable, and ultimately vague notions of harm. If you can provide a clear demonstration that someone engaging in hate speech is meaningfully the same as physical violence committed against a person, I am happy to reassess this position. To be clear, I expect clear criteria for what is hate speech (no "community standards" qualifiers) and proof of consequences as clear as those of other clear harms. Until then, I'm reluctant to infringe on general freedoms in the name of something that does not appear to be a clear harm.

I hate doing this, but I'm also someone who regularly sees hate speech directed against my ethnic group. I am well familiar with the emotional consequences of being on the receiving end of what some might call hate speech.


Start a slippery slope argument? We're way beyond started. We already have people actively seeking to tie the most mild institutions around to the most vile opinions around any way they possibly can, so that they can ban and silence anyone they disagree with. It's already happening, and only a sincere commitment to freedom of expression for all, even those we disagree with, can stop it.

Have you read the Christchurch manifesto? I have. One of his goals was accelerationism, to actively get more people banned from communication by taking extreme actions, so that they get angrier and driven to more violent actions themselves. And we're playing right into the hands of him and those who think like him by banning and censoring everything in sight.


Why would you believe anything in that manifesto?


What do you mean by "believe"? He said that one of his goals was to accelerate the conflict between two sides of the cultural divide. That's a pretty common goal of terrorism. Am I supposed to not believe him, and think he did it for some other reason?

I believe he wrote the truth about what he thinks and why he did what he did. I don't see why he would lie about that. Whether you think any of his points have any validity or agree with any of them is a whole different ballgame.

I also believe that, in a society that aspires to practice freedom of speech and freedom of expression, it is an essential skill to be able to read something you may not agree with, written by someone who took actions that you oppose, and objectively evaluate the content.

Not everyone needs to or can, but some people had better do it, if we are to have any hope of rising above the hate and division associated with these sorts of acts.


I'm originally from a country where laws against "extremism" are actively used to target political opposition for over a decade.

Go ahead, tell me again how slippery slope is a fallacy.


Extremist is not well-defined, despite how much you want it to be. People who thought women had the right to vote a hundred years ago were extremist.


> So we can see that in the context of the article, and the call to action mentioned, "extremism" is not vague, nor meaningless,

The fact you had to provide context for this one case makes your whole point moot, and you don't even realize it.

Sure, it makes sense in this case. But a law that just says extremism (without context) is bad can and will be abused down the line to fit the narrative of the powers that be.


In some online circles, there was a call to ban 8chan because the terrorist posted there. Do you support banning that website because terrorists used it? Then do you support banning Facebook for the same reason? Where does it stop? Or do you recommend just taking down individual posts?



Well, broadcasting your massacre of 20 people online would be one kind of 'extremism'.

So there's that.

Surely it can get tricky (and it will) but we have to do something. This is not new, we always have. Every nation has some kind of hate law, applied one way or another.


I would be interested to know what the technical difficulties are in scrubbing a banned video, and all derivatives, from Facebook.

Are there practical AI/video analysis techniques to detect that a video contains a fragment of another video? Surely.


There are, YouTube has a fairly effective algorithm that is used primarily to remove copyright-infringing content:

https://en.wikipedia.org/wiki/Content_ID_(algorithm)
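
Content ID itself is proprietary, but the general idea behind this kind of fragment matching is perceptual fingerprinting: sample frames, reduce each to a tiny hash that survives re-encoding, and look for runs of near-matching hashes against a reference. Here is a minimal sketch, assuming OpenCV for frame decoding (the hash size, sampling rate, and thresholds are arbitrary illustrative choices, not anything YouTube or Facebook actually uses):

    import cv2  # assumes OpenCV is installed; any frame decoder would do

    HASH_SIZE = 8  # 8x8 average hash -> 64-bit fingerprint per sampled frame

    def frame_hash(frame):
        # Downscale aggressively and compare each pixel to the mean brightness.
        # Mild re-encoding, rescaling, or overlaid logos mostly survive this.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        small = cv2.resize(gray, (HASH_SIZE, HASH_SIZE))
        bits = (small > small.mean()).flatten()
        return sum(int(b) << i for i, b in enumerate(bits))

    def video_fingerprint(path, samples_per_second=1):
        # Sample roughly one frame per second and hash each sample.
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25
        step = max(int(fps / samples_per_second), 1)
        hashes, i = [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if i % step == 0:
                hashes.append(frame_hash(frame))
            i += 1
        cap.release()
        return hashes

    def hamming(a, b):
        return bin(a ^ b).count("1")

    def contains_fragment(candidate, reference, max_distance=10, min_run=5):
        # Flag the candidate if any run of consecutive sampled frames
        # closely matches a run somewhere in the banned reference video.
        for c in range(len(candidate) - min_run + 1):
            for r in range(len(reference) - min_run + 1):
                if all(hamming(candidate[c + k], reference[r + k]) <= max_distance
                       for k in range(min_run)):
                    return True
        return False

Production systems are obviously far more robust than this brute-force scan (mirroring, speed changes, audio fingerprints, indexed nearest-neighbour lookup), but the match-a-fragment-against-a-reference idea is the same, which is also why determined uploaders defeat them with heavier distortions.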


Aren't they missing the point here? The problem isn't that this guy streamed what he was doing on Facebook, it's the fact that he did it in the first place?

As these large hosts move more and more away from mere platforms to content curators, it does make a lot of sense that they'd also be more responsible for what they curate. But at the same time, it seems like this responsibility will ultimately leak back into the parts of these services that are really just platforms, and eventually to those that don't curate content at all.


> The problem isn't that this guy streamed what he was doing on Facebook, it's the fact that he did it in the first place?

The Las Vegas shooter, we're told, had no motive, but there's a public sense that each massacre is trying to "outdo" a previous kill count. With Facebook being the go-to place when one occurs, it's instant fame and notoriety for the perpetrator. For this reason our government and media took the immediate response of "de-naming" the killer, but with instant global online platforms this is after the horse has bolted. This approach of involving platforms directly is to neuter the draw of instant notoriety -- to remove fame (or widespread publicity, if there is an agenda) from being another contributing factor.


A big contributing factor to him doing it in the first place was him marinating in online murder advocacy groups.


Yeah, this is not a slippery slope at all, no sir. And the real extremists will just use Tor.


The police in Germany already proposed to ban Tor as it's "really not needed in western democracies".


Laying down the groundwork for the Fourth Reich, I see. 30 years down the road someone gets elected who really doesn't want to go when the term is up. If things are well bolted down by then, they could stay indefinitely by simply sending people to jail for "extremism". It's as if history doesn't teach people anything at all, even recent history of less than 100 years ago.


The approach seems fairly reasonable: it sounds like it's limited to explicitly violent extremist content, and it's being done using the pension funds of various governments, in an activist-investment manner, to try to bring about change.


Historically, a combined effort by corporate and government interests to take away your rights was called fascism. But I guess if it is the far left doing it, it's okay.


Not a single one of those signatories is far left. The overwhelming majority are centre right.

And historically, efforts to curb uncritical airtime for violent and dangerous people have been fairly common (albeit not entirely uncontroversial). Neither Ireland nor the UK were ever far left or fascist but RTÉ and the BBC both heavily restricted interviews with loyalist and republican paramilitaries during the Troubles in Northern Ireland.


Who is far left in this? Jacinda Ardern represents a centre-left coalition government.


Some people like to use the term "far-left" to mean "far-left of me", without acknowledging that they're nowhere near the center themselves.


In fairness, people do the same thing with "far right". I've met some people who would likely characterize President Obama as far right.

This particular pathology seems to arise from the idea that "far $TYPE" is inherently de-legitimizing. And once you've de-legitimized someone or some position, you don't actually have to take them seriously, so...


What's the difference between left and right?

The old definition was about republic vs monarchy. Now everyone keeps using those labels. Conservative? Right wing. Socially conservative? Right wing. Economically conservative? Left wing. Weird!

Authoritarian government that's socially conservative: Venezuela. Wait, what? Thought they were labelled far left?

Left and right labels are pointless; refer to the parties by what they represent: socially conservative, authoritarian, etc.

https://en.m.wikipedia.org/wiki/Left–right_political_spectru...


Today's public uses political terms very incorrectly, and I think lots of us are confused because of it.


I find it problematic that there have been oodles of very classical kinds of 'terrorism' and 'extremism' on Social Media since the start.

ISIS has been recruiting with absolutely brutal kind of stuff on Twitter, etc..

But now we have this nutbar thing in New Zealand and it's a 'global action'?

Aside from the complications mentioned in some other comments ...

... the Jacinda / Trudeau / Macron triumvirate I think were looking in the wrong places.

So it's probably good that we're taking action, and it's just beyond repulsive that the massacre was broadcast live on Facebook, but I hope we can do this without too many existential issues.


Make the internet more expensive, therefore limiting its impact among the stupidest members of society -- the working class.


This is horrifying


Why? I'm honestly surprised by the brevity of your comment and by how strong your opinion apparently is in one direction.


It is organized censorship across a large part of the media people use to communicate today, as a cooperation between governments and those companies.

It really doesn't need more than the OP's comment: this is abhorrent. It's quite a big step towards an authoritarian society and the transformation into dictatorship.

In hindsight the generation of the anti-authoritarian left growing up after the fall of the USSR got rather careless with authoritarian tendencies on the left. Lessons learned my ass, here we go again.


Imagine the rules that are codified because of this being used by people you ideologically disagree with. Generally laws are very _very_ hard to get rid of.


Live streaming murder is horrifying.


I thought that (as an example) the pioneering video recording of the Vietnam war was essential in shaping US public opinion on the war. I could be wrong, but it wasn't seen as something horrifying back then.

It's of course the actual murder which is horrifying.

The problem isn't the documentation, but how we as a global society choose to work with those documents.

I'm not particularly deep into the issue, but I feel there must be something between glorifying it in some engagement-metric heavy filter bubble and making it an agenda to purge whole vaguely defined categories of content from the internet.


> an agenda to purge whole vaguely defined categories of content from the internet.

But that's not the agenda proposed at all. The agenda is to prevent violent extremism and terrorism promoting content from spreading on social media, not stopping war reporting.


[flagged]


Not at all. What made you think that?

The documentation itself just isn't the issue. I in fact wanted to highlight that context is a key factor by writing:

> The problem isn't the documentation, but how we as a global society choose to work with those documents.


Being completely disconnected from the killings you sponsor with your taxes is horrifying.


I agree, but that's not really relevant to the discussion about extremism and online censorship.

Also I'm Irish, living in Ireland. Apart from maybe funding a health service that's not fit for purpose, it's a serious stretch to say that my taxes are sponsoring the killings of anyone.


It is, because without the "Collateral Murder" video, most people in the US wouldn't have been exposed to the horrors of what their tax dollars were doing at that time.


[removed]


"Never let a good crisis go to waste"


<This comment has been removed due to its violent nature>


The Aristocrats!


[flagged]


The US would be a laggard here regardless, given its enshrinement of free speech


The US has plenty of laws against the use of speech to promote violence. The Netherlands has, in practice, some of the most expansive free speech rights of any corner of the planet, yet they had no problem signing on (to this non-binding agreement).


AFAIK courts have established precedents that it has to be a "threat of imminent harm" -- so promoting violence is OK ("kill all XXX") but shouting "fire" in a theater isn't.

Streaming a terrorist attack doesn't seem to result in a "threat of imminent harm" I think - if anything, it could prevent harm (because the viewers could report the violence and police could prevent further violence).


> but shouting "fire" in a theater isn't

There are legally recognized limits in the US, but that particular example isn't a good one.

(https://www.theatlantic.com/national/archive/2012/11/its-tim...)

> But those who quote Holmes might want to actually read the case where the phrase originated before using it as their main defense. If they did, they'd realize it was never binding law, and the underlying case, U.S. v. Schenck, is not only one of the most odious free speech decisions in the Court's history, but was overturned over 40 years ago.


"The US has plenty of laws against the use of speech to promote violence."

What laws are you talking about?


An example is the law against distributing information on making bombs/weapons with the intent of promoting/enabling violence.

https://www.law.cornell.edu/uscode/text/18/842


In these sorts of discussions I think it's very important to quote the relevant parts of such a large, dense document when linking to it.

US Code, Title 18, §842(p)(2)

(A)

> to teach or demonstrate the making or use of ... or to distribute ... information pertaining to ... the manufacture or use of an explosive, destructive device, or weapon of mass destruction

> ... with the intent that the teaching, demonstration, or information be used for, or in furtherance of, an activity that constitutes a Federal crime of violence

(B)

> knowing that such person intends to use the teaching, demonstration, or information for, or in furtherance of, an activity that constitutes a Federal crime of violence

Note that this very narrowly applies only when intent can be demonstrated, at which point you are arguably an active participant in whatever crime is being committed. In my opinion, that is quite different from your earlier claim.

> The US has plenty of laws against the use of speech to promote violence.

This isn't regulating mere promotion, it's regulating a form of active participation. The Christchurch Call agreement isn't even remotely similar.


Bomb making instructions are generally protected speech.

Distributing such instructions with the intent to further a crime makes the speech unprotected. But, each element (e.g, knowingly intending to further a crime, and so on) has to be proven in a court of law to strip the speech of its protected status.

From the statute you linked: "...with the intent that the teaching, demonstration, or information be used for, or in furtherance of, an activity that constitutes a Federal crime of violence;"

edit: typos


You should be aware that just because something is in the US Code, doesn't mean that it's actually enforceable. For example, there's a law that bans burning the American flag:

https://www.law.cornell.edu/uscode/text/18/700

However, this act was explicitly ruled to be protected political speech under the First Amendment:

https://en.wikipedia.org/wiki/United_States_v._Eichman

In a similar vein, the law you've linked to would still be subject to the "imminent lawless action" analysis, were someone to be charged under it.


Perhaps. Unfortunately, the absolute enshrinement of anything, and the inability to critically evaluate when to allow common-sense deviation from such enshrinement, is a bit of a progress-stopper.


The problem with common-sense as a guide for law is that it changes dramatically over time, and often over short periods. It also remains to be seen whether or not this will result in progress in the long term.

It's fortunate that we've been in one of the longest periods of relative peace in post dark age Western European history, but I don't generally count on that trend continuing forever. Making sure a government has significant enshrined limitations on its own power is important, not in the good times, but in the bad.


That's the point of inalienable rights. You can't "progress" them away.


> common-sense deviation

Common-sense deviations, a.k.a. exceptions to the rules, are often ill-conceived acts of emotional stupidity that don't age well. If it was so obvious, why wasn't it part of the rule in the first place?


Same argument applies to those rules or enshrined values. Times change.


[flagged]


Huh? The US has been very aggressive about shutting down websites that spread Al-Qaeda and ISIS propaganda on most websites. ISIS basically caught everyone with their pants down spreading their stuff on social media, but once people started noticing it, it was purged rapidly.

It's not my view, but you could make the argument that the original sin was censoring Al-Qaeda content post-9/11. Obviously, nobody complained. Now the same tactics are being turned against the far-right and people have discovered a new love for free speech.


Yeah, maybe I wasn't paying attention, but I didn't see any people protecting ISIS's free speech. Now that this censorship is targeting other hate speech, suddenly everyone is shouting "free speech". The fact is, most people calling for free speech don't truly believe in it, they just want to be able to spout their own version of hate.


FWIW, there were a small group of people raising concerns about how the government went about shutting down "terrorist" websites after 9/11 but it largely fell on deaf ears. These people have been ideologically consistent, but they're outside of the political mainstream.


Also consider the fact that even if you do support free speech, speaking out against censorship of a terrorist organization is only going to lead to blowback, because you will be seen as supporting the enemy.


First they came for al Qaeda content,

Then they came for ISIS content,

Then...


Basically this. It was Pandora’s Box.


We're the same people that think PayPal shouldn't be allowed to seize your funds for writing "ISIS Beer Fund" on a $40 toss over to your mate...


> "Huh? The US has been very aggressive about shutting down websites that spread Al-Qaeda and ISIS propaganda on most websites"

You mean like archive.org? That's the website I've seen hosting all the ISIS shit.


True, there's a few other "ISIS archives" that are used largely by intelligence analysts, journalists, historians, and academics for study. But I don't think NZ is saying "Hey journalists and academics, don't study a dump of 8chan!"


What NZ has been saying is that there should be no way for an ordinary, unaccredited member of the public to view any archived copies of any of the Christchurch killer's propaganda anywhere online, something normal internet users apparently can do with ISIS propaganda via archive.org. They've been quite consistent about this since day zero, from what I can tell.


I wouldn't be so sure. Following the aftermath of Christchurch, my network (AU) and several others have not unblocked the chans, and I don't believe they are planning to either.


Can you clarify if there are actual legal restrictions on that content in place, or is this still unilateral ISP action?


There are no current legal restrictions as I understand it; it is the decision of my provider to continue restricting this access. The restrictions are not universal -- the largest ISP has lifted them.

I will be changing providers as soon as my contract allows.


No, they're just trying to be the sole group who decides who is part of those two groups.


What do you mean? I was under the impression that ISIS was extremely censored online, particularly by the companies that signed this. Is that not the case? On the other hand, I see white supremacist propaganda basically daily at this point.


A couple of years ago, Cloudflare refused to remove access to ISIS websites, and said it's not the job of ISPs or themselves to police content.

https://www.ibtimes.co.uk/anonymous-opisis-cloudflare-refuse...


The policy of one internet infrastructure firm doesn't validate the original claim of ISIS' online activity being 'ignored for well over a decade.'


It does imply they take it less seriously than Stormfront, though.


Do you mean Cloudfront? I suppose so, but I don't know enough about the company to reach a firm conclusion.


No, Stormfront was a ... racially charged ... website that Cloudflare actually did refuse to host.


I meant to say Cloudflare, sorry. I know what Stormfront is.


But you already censor things much less damaging than the live streaming of a terror attack. Instead of creating a beachhead on freedoms, which is damaging, you should look at why the US rates so low on freedom, civil liberties and democracy, why most of the signatories of this call rate so highly, and try to enact positive improvements.


The first amendment is against government censorship, not private companies providing platforms for free speech on their private networks.


[flagged]


This guy was a socialist who literally killed Nazis. Please, spare a couple of minutes, and read what he had to say, and note that this was originally written in 1945.

https://www.orwellfoundation.com/the-orwell-foundation/orwel...


See how you set up the perfect straw-man here - "If you aren't on the side of our perfect, morality infused, turbo-boosted censorship, you must sympathize with Nazis!".

The only time free speech matters is when the nature of speech is controversial! "I disapprove of what you say, but I will defend to the death your right to say it." is a more accurate summation of the U.S. govt., and the history behind the first amendment.

Also worth noting that an actual incitement of violence is expressly illegal under U.S. law, and the ChristChurch shooter's manifesto et.al. is probably already illegal, without requiring a new speech police


Incitement of violence is actually mostly legal under U.S. law except when it calls for imminent action that is likely to happen, which is a surprisingly high bar to meet. See Brandenburg v. Ohio, 395 U.S. 444 (1969) [0]. Take note that this ruling was in defense of the KKK, but that it was later cited to protect the speech of the former NAACP leader, Charles Evers [1].

>The conditions that must be met to impose criminal liability for speech that incites others to illegal actions are imminent harm, a likelihood that the incited illegal action will occur, and an intent by the speaker to cause imminent illegal actions. This precedent remains the principal standard in this area of First Amendment law, since the Supreme Court has not revisited it.

[0] https://supreme.justia.com/cases/federal/us/395/444/

[1] https://www.aclu.org/blog/free-speech/free-speech-can-be-mes...


Also worth noting that an actual incitement of violence is expressly illegal under U.S. law, and the ChristChurch shooter's manifesto et.al. is probably already illegal, without requiring a new speech police

Your understanding of the law is wrong. Incitement is assessed (in federal law) by the standard set in Brandenburg v. Ohio which requires both the intention to produce imminent lawless action and the likelihood that it will be produced.

https://www.law.cornell.edu/wex/brandenburg_test

Let us consider a document whose entire text is "Reader, please commit a crime as soon as humanly possible." It is targeted (to the reader), specifically encourages lawless behavior, and has a sense of immediacy.

But does it produce imminent lawlessness? If I say those words to you in person, the time between my utterance and your hearing of it is negligible. If you find me sufficiently inspiring or intimidating you may be moved to act upon my exhortation. But when you read a document, it might have been written 5 minutes ago or 5000 years ago. It may have been produced near to you or on the other side of the world. You may know and care who the writer is, or have no clue. Conversely, as the writer of a document my ability to predict when and where it will be read, and by whom, is similarly limited. Thus, a document by itself lacks temporal, spatial, or social proximity compared to an interpersonal interaction - making imminence virtually impossible to prove.

Likelihood of producing action is very subjective (and thereby also hard to prove). In this case I have offered no incentives or specificity to my exhortations, so you probably feel little motivation to select and carry out a crime and put yourself in legal jeopardy. If you were to cite this document as an exculpatory factor few people would take you seriously. Even if I offered an elaborate rationale and specific directions for committing a crime, when they're in documentary form I don't really know who will read it and have even less knowledge of how readers will react to it; it's very hard to say how anyone could assert a definite probability of 0.5 or greater that the crime will be carried out within any given period.

tl;dr it's almost impossible to prove the US standard of incitement outside of a very narrow range of circumstances, so making the 'incitement is already illegal' argument is the legal equivalent of sweeping dirt underneath a rug and then forgetting about it.


It's Christchurch, not ChristChurch. I wasn't going to say anything but you've done it twice so I assume it's not a typo.


Personally, I'm in favour of christ_church.


[flagged]


> When the discussion comes to any other topic such as police brutality, racism, discrimination etc. these voices are conspicuously absent.

Patently untrue, and a baseless accusation. If anything, the recent push for compulsory body-cams on cops and the livestreams of actual police brutality have faced ZERO calls for censorship, least of all from free speech advocates.

> things are illegal but let's not make any laws or have any clear legal mechanisms to effectively enforcing this

What does this even mean? If something is illegal, those with standing can obviously enforce the law.


Digging up the profoundly dirty history and reality of the US is unfortunately not taken very seriously, and actions by the police towards minorities, etc., are excused as justified by the same people who would argue against government tyranny.

It's easier to claim that the US is an absolute bastion of free speech when you can conveniently hand-wave away all of the instances where it is not. Especially when it never actually affects you. It came as absolutely no surprise when I saw almost zero reaction from the 'free speech' crowd when a woman was prosecuted for laughing at Sessions.


The case you cite [1] should only strengthen the argument for free speech.

The First Amendment is likely what saved the lady from a frivolous prosecution/censorship by those who abuse the power to regulate others' speech.

What you want is for that power to reside with a cabal that aligns with your politics; you want a thought police you approve of, all while citing a case where someone was literally prosecuted for a thought-crime. I hope you see the irony, and that power often shifts with wild swings. Someday, you might find yourself at the wrong end of a thought-policing policy you shilled for, because those in power aren't aligned with you. Hence the case for constitutional free speech protections.

[1]: https://www.npr.org/sections/thetwo-way/2017/11/08/562823691...


No, I think the US administration, Trump in particular, sympathises with actual Nazis - we saw this in Trump's reaction to the Nazis marching in Charlottesville: "very fine people," he said.


Utterly nonsensical. You don't need the internet to be radicalized; people have been willing to kill in the name of their convictions for as long as humans have existed. I predict that even if somehow every white supremacist were booted off the internet, no lives at all would be saved, as they just don't need the internet to kill people, or to learn to hate.

This is purely giving up rights for the sake of security theatre.


You don't accept that the internet has enabled people with obscure and extreme views to better find each other, legitimize each other, spread their views and recruit others to their causes?


> better find each other, legitimize each other, spread their views and recruit others to their causes?

It's allowed everyone to do that. That's why we have more acrimony today. Allow a level playing field, and the violent stupid losers will lose. Start taking away people's rights, and you've only given those toxic people a pretext. (Which is exactly what the Christchurch shooter was trying to do.)


Are the stupid flat-earthers and anti-vaxxers losing? No, they're doing much better than they ever were before, thanks to Internet platforms.


> Are the stupid flat-earthers and anti-vaxxers losing? No, they're doing much better than they ever were before, thanks to Internet platforms.

Yes, relatively speaking. Everyone is more organized and connected. Everyone has more media creation capability. Everyone is more deeply ensconced in like-minded groups and more deeply taken by groupthink. This is no different from nearly everyone in the industrialized world having access to refrigerators, washing machines, and antibiotics. (Note that we aren't as acrimonious about the misuse of that last one, even though it's arguably just as bad as, or worse than, vaccine shenanigans in the long term. Could that be due to the perpetrators being of high socioeconomic level?)

What you want to do is to single out a segment of the population, give up on them, and withhold the benefits of progress from them. That's a big red flag right there. We shouldn't have a society where the majority gets to arbitrarily suppress the minority. A part of the point of the US constitution is that the minority is also protected from the majority.

If you are really for justice and equality, then don't ask for the power to create second class citizens.


> Allow a level playing field,

Say by allowing a clearly incompetent and deranged lunatic the same airtime as his opponent? Or by giving a racist grifter airtime on a flagship political program?

> and the violent stupid losers will lose.

Alas, Trump won. Farage won. Bolsonaro won. Duterte won. The violent stupid losers keep winning despite their views getting wide public airings.


No, I don't. Neo-Nazis have always existed, have always killed people, and have always been very few. The same goes for any other group you care to name. Violence overall is on a downward trend.


Please provide evidence for your assumptions to back up your very strongly stated assertion.


Why should the web be treated any different to radio or TV?

A cable news channel doesn't get carte blanche to broadcast uncensored beheading or mass-shooting videos at any hour of the day, so why should a website not be obliged to take steps to curb the exact same thing?


Why should the likes of Fox News be exempt?


I don't think they are. Fox News, as slanted as it is, doesn't endorse terrorism or anything along those lines the way some internet communities or message boards do.

There's really no comparison between even the worst tabloid newspaper and some of the stuff that festers on the internet.


> doesn't endorse terrorism

They use propaganda to make people support the war effort against the Middle East. Ask people in the Middle East who the terrorists are.


Every mainstream outlet promotes wars in the Middle East.

The only individual who is saying to not do that is... Tucker Carlson, who is on Fox.


> They use propaganda to make people support the war effort against the Middle East.

Didn't realize Democratic congressmen/women were taking orders from Fox News.


Considering that religious speech and the guiding religious books are often completely intolerant of all other views of the world, and can thus be considered extremist, it would appear that such an agreement could ironically cause the censorship of the very religion that was brutally targeted in these attacks. Or perhaps they have specific censorship goals in mind? Think of the result in Alabama just yesterday: it is abundantly clear that Christianity as an ideology causes real, tangible harm to women and ought to be completely scrubbed from social media.


I hope one day we will be able to condemn these politicians, bureaucrats, and big corps for their crimes against free speech, just like we did with the Nazis when they tried to subvert Europe. We need serious laws with draconian punishments to protect our rights; what we have now is insufficient.


It'd be great if people who want to explore this as a free speech issue would engage with the question of what happens to the free speech (and other) rights of people who are killed by extremists, and whether they are more or less important than the rights of people who advocate such killings.


The comparison isn't between 40 people's right to live vs. every terrorist's right to free speech. It's between some probabilistic chance that 40 people's right to live isn't violated in the future due to these actions, versus some probabilistic chance of millions of innocent people being silenced and having their privacy violated in various ways due to the collateral damage from this kind of policy making for the foreseeable future. This shit never gets revoked once it's in place.

History is rife with examples of people's free speech being violated. How many examples do we really have of someone's free speech being successfully violated to protect proportionally more important rights?

The entire western world is slowly giving up every single ounce of privacy and freedom, and in exchange for what? ISIS is finished, the rate of Muslim terrorist attacks seems to be falling off pretty fast, and the swell of fascist sentiment will slowly wind down too once the factors that triggered it are no longer present. This isn't some new phenomenon that's never happened before. And in 10 or 15 years, are we going to be happy with the state of government control in countries like Australia, NZ, the UK, etc., given what we've got out of it?


There are a lot of counterfactuals here, both numerical and historical - for example, your suggestion that policies put in place to deal with a problem are never revoked, which is simply not supported by fact. As your whole post is dedicated to invalidating the question I posed, I hope you'll excuse me for not spending an hour on a point-by-point refutation of your numerous and very broad claims.


Those killed by extremists have had a right more fundamental than free speech violated -- their liberty. However, free speech did not cause their lives to be ended -- another individual's choices did. We have a system to deal with this. It's called courts, the justice system, and prison. It's worked for thousands of years to produce reasonable societies.


Sorry for not responding earlier. I largely agree, but I think we are at a point where we need to weigh the actual costs that are involved and assess how well our approach is working - in my view, increasingly poorly.


> but I think we are at a point where we need to weigh the actual costs that are involved and assess how well our approach is working - in my view, increasingly poorly.

Luckily, we don't need to consult your 'view' to see how these systems operate. Given that violent crime is at an all-time low, especially when considering the entirety of history, I think it's safe to say that if we consult reality, things are for the most part working just fine.


Free speech is just as much about those on the receiving end as those on the producing end. And once the information has passed into other hands, you're dealing with the producing end again.

Your thesis is a non sequitur as well; people sharing and receiving the information are not necessarily advocating the killings. In fact, just like with war footage from the past, they could be using it to the opposite effect.


You're refuting claims I'm not making, while avoiding the question I am asking. As an extremism researcher I'm well aware of such complexities, thanks.



