
> It's not misuse if it's common usage, it's simply usage you disagree with.

Whether you or I personally agree on what the definitions should be is not the problem. The meaning has changed for only some users of these words, resulting in widespread misunderstanding and, I would argue, in further (and completely unnecessary) political polarization. Words not having shared meanings seems sub-optimal to me.

> Obviously when people say "Nazi" they don't mean "members of a 1930s political party," they're making a statement about something else

But no one knows what statement they're making because we no longer have a shared dictionary, so people resort to their imaginations.

> If you find the usage of these words confusing that's on you, not the words or the people using them.

This suggests the problem is my inability to understand, when the problem is that there is no longer a common definition. There are certainly multiple popular definitions, depending on which political ideology you subscribe to.

> instead of wringing our hands and worrying about the poor slippery slope.

It's not me who's "wringing my hands" - I'm not promoting or supporting any policy change, you are. I'm simply asking questions.

> Deplatforming is not censorship.

https://en.wikipedia.org/wiki/Censorship

"Censorship is the suppression of speech, public communication, or other information, on the basis that such material is considered objectionable, harmful, sensitive, or "inconvenient". Censorship can be conducted by a government, private institutions, and corporations."

Well look at that, once again our definitions don't match. Care to share the URL for the authoritative source you're using?

> You are not guaranteed an audience for your speech.

As I said: there is a distinct difference between the First Amendment and the general principle of free speech. You only have a "guarantee" of free speech in the limited context of the First Amendment.

> This is irrelevant to the discussion at best and disingenuous at worst. Facebook can ban multiple kinds of content, including pro-ISIS content (and it does). White nationalism being banned is what we're discussing here.

I thought the principle behind your argument was harm; maybe the principle is race after all?

> I believe that social media is helping incite these incidents, so I do legitimately believe the amount of people participating in hate crimes will be reduced as a result of this.

A perfectly reasonable theory; what evidence do you have for it? Are you well read on the topic, or is this just a casual opinion that sounds about right?

> Do you really think that people aren't radicalized by communities on the Internet? Communities that might exist on Facebook?

I certainly do, but I'm concerned about everything that could lead to radicalization. History is full of examples demonstrating the strange and complex behavior of individuals and civilizations. Are we over-simplifying this situation? Should we freely discuss such things with open minds?

> No discourse of value is being lost by banning white nationalism

Completely agree, but is that all there is to it? Can you think of any possible undesirable reactions (regardless of the soundness of the logic underlying the motivation) to this type of policy?

> If you want to read white nationalist speech, it still exists. This merely means you can't read it on Facebook anymore. How this is a tragedy I simply cannot understand.

The tragedy is if this further inflames emotions, "righteously" or not, and someone shoots up another church. That's what I'm worried about.

> Yes, it's so. Read the manifesto.

The link I included certainly didn't make this fellow sound like the stereotypical redneck racist you claim he is, at least by my understanding of that stereotype. But never mind, let's once again not bother ourselves with messy details when vague stereotypes will suffice.

>>> They believe that by removing these people from their site they might reduce incidents of these shootings. Do you think they're wrong?

>> They might be right, they might be wrong, unlike you I make no claim to know. What if they are wrong, and censorship aggravates these already mentally unbalanced people even more, driving them to organize (perhaps in the physical world) underground where it's difficult to see what they're up to or thinking? Is this worth considering when setting policy?

> No.

Well then, that answers my "Do you care?" questions too.

> People can (and will) do anything in response to anything; maybe white nationalist groups will make Zuck their #1 enemy over this. You can't control the actions of other people, especially crazy people, so catering your policies to them seems absurd.

If everything is random, why even bother with policies like this, or any at all?

> I don't really see many legitimate concerns arising from banning white nationalism from Facebook.

I obviously disagree, but if you believe the reaction of extremists is not worth considering, at least you're being logically consistent.

> I just don't see what's being lost here, except white nationalist content on Facebook.

If any future negative reactions (in part due to this type of policy) of extremists are excluded from consideration, you are completely correct.

> who cares?

People attending a future church service might, if there is any validity to my belief that people aren't as simplistic as you think they are. But we'll never know for sure, because, like you, the media seems to prefer not to bother with the messy details of why people commit atrocities when a simple stereotype works (and certainly sells more papers as an added bonus).

> Make another site if you want to discuss what they deplatform.

A perfectly fitting conclusion.




> This suggest the problem is my lack of ability to understand, when the problem is there is no longer a common definition.

The only misunderstanding seems to be yours. We can clearly identify white nationalists as actual Nazis, no redefinition required.

> Censorship can be conducted by a government, private institutions, and corporations.

No one is saying that corporations can't censor things. I am saying that this is not censorship. Facebook deplatforming you is not the same as censoring you.

> maybe the principle is race after all?

The only thing we're discussing here is the deplatforming of white nationalist content from Facebook. Bringing other races or religions and their extremism in is totally tangential at best, and actively disingenuous at worst.

> Are you well read on the topic, or is this just a casual opinion that sounds about right?

There are many examples in the news recently. Here's one I read just the other day: https://www.washingtonpost.com/politics/2019/03/22/trumps-rh... White nationalist speech creates hate crimes; the correlation appears clear to me. Of course, it also seems logical to me without any sources, so sources are just a nice bonus.

> Can you think of any possible undesirable reactions (regardless of the soundness of the logic underlying the motivation) to this type of policy?

Essentially your argument seems to boil down to: we shouldn't create good policies because crazy extremists might have bad reactions to them. But crazy extremists have bad reactions to everything. We should be trying to deplatform and deconvert extremists instead of catering to their sensitive tastes. If that offends them and causes them to lash out, that is unfortunate, but they'll do that anyway. At least if they do it over this, there might be fewer of them in the future.

> If everything is random, why even bother with policies like this, or any at all?

While you can't control how people react to what you do, you can do what you believe is right and hope it has a good outcome in the future. Facebook apparently agrees with me.


> The only misunderstanding seems to be yours. We can clearly identify white nationalists as actual Nazis, no redefinition required.

a) With no standard definition of the phrases?

b) Some people may be able to, but can Facebook police speech accurately and fairly, at scale, when many of these terms have no standard accepted meaning? Remember, we're not dealing with people wearing white hoods at a rally; we're dealing with speech, which is subtle. Sure, everyone can agree at the extremes, but close to the middle it gets complicated. It's like "I know pornography when I see it."

> No one is saying that corporations can't censor things. I am saying that this is not censorship. Facebook deplatforming you is not the same as censoring you.

https://newsroom.fb.com/news/2019/03/standing-against-hate/

"Today we’re announcing a ban on praise, support and representation of white nationalism and white separatism on Facebook and Instagram, which we’ll start enforcing next week."

I'm not saying they don't have a right to do this, it is their platform after all. I'm not even saying that it is necessarily or certainly a bad idea. You and I disagreeing on this is fine and healthy. But how can you interpret "you can't say <x>" as not censorship? I asked you earlier for the definition of the word you're using, and you didn't reply. I ask again: please tell me the definition you're using for "censorship", with a link to the source.

> The only thing we're discussing here is the deplatforming of white nationalist content from Facebook. Bringing other races or religions and their extremism in is totally tangential at best, and actively disingenuous at worst.

No, that's the only thing the article is discussing. The HN discussion, and our thread in particular, are discussing broader principles of free speech and fairness, possible downsides of these types of decisions, etc.

>>> I believe that social media is helping incite these incidents, so I do legitimately believe the amount of people participating in hate crimes will be reduced as a result of this.

>> A perfectly reasonable theory, what evidence of this do you have? Are you well read on the topic, or is this just a casual opinion that sounds about right?

> There are many examples in the news recently. Here's one I read just the other day: https://www.washingtonpost.com/politics/2019/03/22/trumps-rh.... White nationalist speech creates hate crimes; the correlation appears clear to me.

"To test this, we aggregated hate-crime incident data and Trump rally data (a different variable than our topic of conversation, but again, no need to bother ourselves with precision or details) to the county level and then used statistical tools to estimate a rally’s impact. We included controls for factors such as the county’s crime rates, its number of active hate groups, its minority populations, its percentage with college educations, its location in the country and the month when the rallies occurred. We found that counties that had hosted a 2016 Trump campaign rally saw a 226 percent increase in reported hate crimes over comparable counties that did not host such a rally. Of course, our analysis cannot be certain it was Trump’s campaign rally rhetoric that caused people to commit more hate crimes in the host county."

You can find a correlation in data for anything you want to support, see: http://www.tylervigen.com/spurious-correlations

Now, stating that doesn't prove your claim is wrong; I'm just pointing out that the one piece of evidence you finally provided is little more than an op-ed piece. We should be collecting more and better data on these things if they're important, so we can set evidence-based policy.
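
For what it's worth, the kind of county-level estimate the quoted passage describes is easy to sketch: aggregate incidents per county, add a treatment indicator and controls, and fit a regression. Here's a minimal, purely illustrative Python sketch with made-up numbers and invented column names (it is not the study's actual data or code); the coefficient on the treatment indicator is still only a correlation, which is exactly the limitation the authors themselves concede.

    # Hypothetical sketch, not the study's code: county-level regression of
    # reported hate-crime counts on a "hosted_rally" indicator plus controls.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Illustrative data: one row per county, all values invented.
    counties = pd.DataFrame({
        "hate_crimes":  [3, 0, 7, 1, 12, 2, 5, 0],    # reported incidents
        "hosted_rally": [1, 0, 1, 0, 1, 0, 1, 0],     # treatment indicator
        "crime_rate":   [4.1, 2.3, 5.0, 1.9, 6.2, 2.8, 3.7, 1.5],
        "pct_college":  [28, 35, 22, 40, 19, 33, 26, 45],
        "pct_minority": [12, 8, 20, 5, 25, 10, 15, 6],
    })

    # OLS with controls: the hosted_rally coefficient is an estimated
    # association, not proof of causation (omitted variables, selection, etc.).
    model = smf.ols(
        "hate_crimes ~ hosted_rally + crime_rate + pct_college + pct_minority",
        data=counties,
    ).fit()
    print(model.summary())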

Erring on the side of caution (as Facebook is doing) is fine, but there's no need to tell lies in the process as far as I can see.

> Of course, it also seems logical to me without any sources, so sources are just a nice bonus.

This sentence explains this conversation, as well as the general state of political conversation in 2019: facts and evidence are considered completely optional.

> Essentially your argument seems to boil down to: we shouldn't create good policies because crazy extremists might have bad reactions to them.

No, that is one small point of my overall concerns (my "argument", if any, is that you won't provide any evidence for your claims, or you assert that none is necessary when curtailing general free speech), and it's an important point. People are becoming so politically polarized that it is causing strange behavior. Some people lose the ability to engage in logic- and evidence-based conversations on particular topics; others shoot up churches. Shit is pretty seriously fucked up and doesn't seem to be getting better. Being cautious and thoughtful about non-obvious risks seems like a good idea to me, not something to avoid.



