But realistically, these people have been violating the ToS for a long time. Late enforcement is better than no enforcement. Give them an inch and they'll take a mile.
It's true. The only thing that truly changes behavior is consequences: arrests for in-person insurrection, trials, convictions, deplatforming, disruption, and I would even say enforcing the laws we already have against incitement of violence, which means arresting people who are posting hateful shit online. This whole thing needs to be busted.
Not doing anything because they might be "upset" is tantamount to surrender. Then they are literally holding us hostage. That cannot be tolerated.
Agree, and will add that this should be backed by LEOs tracking online posts that incite violence back to people's doors, arresting the posters, and letting courts decide whether it's a free speech issue.
I absolutely agree as far as arrests and convictions for violence, incitements to violence, and conspiracies to commit violent acts.
However, I think deplatforming is a poor long-term solution. Deplatforming on mainstream platforms inspired and fueled the rise of Parler. Deplatforming gets rid of all carrots and all future sticks the mainstream platform has against a given extremist. I think 1-month to 3-month suspensions are more likely to keep followers around, where the mainstream platform has an opportunity to suggest a mixture of center-left and center-right content to them, and provide carrots to rehabilitate extremist publishers.
As I've posted elsewhere, taking Parler offline is probably a necessary short-term solution to an acute threat, but I think it's rarely the right long-term solution.
Edit: I'm fine being downvoted, but if you have a good counter-argument, please post it or a link to it. Try to change my mind instead of just venting your anger.
They won't have a unified place to plan their next attack in a few days? Where do you think they planned the January 6 attack? Deplatforming has been shown to work, or at the very least to heavily slow the spread of misinformation. When was the last time you heard about Milo?
The planning wasn't done there; that was just bloviating. The actual planning happened on Telegram and Signal (and to a lesser extent Keybase and Wire). Plenty of people doing OSINT on the events could tell you that.
Interestingly, the banning of all these accounts has caused the size of the Telegram groups to grow dramatically and the discourse to get much more violent.
As I've suggested elsewhere, I think 1-month to 3-month suspensions are much better than lifetime bans. Suspensions keep followers hanging around the mainstream platform, where you can suggest centrist content to them.
A lifetime ban gets rid of all carrots and all future sticks the platform can offer.
Sure, but do we really want the platform to take an active and knowing hand in shaping politics?
Right now they have an effect through algorithms, but their actions don't seem to be coordinated toward any goal other than increasing engagement with the platform. That is materially different from pitching centrist content to extremists because the platform owners happen to be centrists themselves.
What if the platform owners become extremists, as happened with Parler?
Up until 3 or 4 years ago, I was nearly a free speech maximalist. Now, it seems clear that society is poorly equipped for social media and adjusting to it poorly.
I think giving more control to centrist platforms is the least bad of all the bad options I see. I'm not sure how to mitigate the long-term risks this introduces, but the bifurcation of society appears to be a slow-motion car crash in progress.
I'd love to be convinced that there are better options, especially if they allowed me to move closer to free speech maximalism.
I think it would help to change middle school and high school civics curricula by adding cognitive biases, logical fallacies (including some basic statistical fallacies), and other tools to better handle social media.
I think even many of Milo's fanbois realized he had jumped the shark even before he got deplatformed.
I think shutting down Parler is a necessary response to an immediate and concrete existential threat to the long-term functioning of American democracy.
However, the deplatforming of Milo was part of the impetus for the creation of Parler. Keeping Milo on and suggesting center-left content to his viewers would have been a better long-term solution, along with suggesting center-right content to viewers of extreme left content. (Edit: actually, come to think of it, you'd want a mixture of center-right and center-left content suggested to both extremes, to make the medicine taste a bit less bitter.)
Deplatforming may be necessary for immediate threats, but it's not a good long-term solution to the problem of polarization. Long-term, deplatforming feeds the conspiracy theories and pushes viewers further from the reach of mainstream platforms.
I'd encourage you to read "Talking to Strangers".
There's a section in there about coupling, which basically describes how, if you increase patrols in the worst parts of town and get crime there down, it mostly doesn't just move "elsewhere". These people had good reasons for being drawn to Parler, and while taking it offline won't stop anybody on its own, I think it's a significant blow.
The counter-argument is that the main draw of Parler was deplatforming (of people who weren't calling for violence) on mainstream platforms. In hindsight (and for some, in foresight), finding a pool of center-left content to suggest to viewers of far-right social media personalities, and center-right content to suggest to viewers of far-left personalities, would probably have been a much better long-term solution. I also think you'd want 1-month or 3-month suspensions for suggesting violence, rather than full deplatforming, to reduce the number of followers who jump to more extreme platforms.
Edit: actually, I think you'd want one pool of center-left and center-right content to show to both extremes, to make the medicine taste less bitter.
Vice did a very interesting look into this question following the deplatforming of Alex Jones[1]. As is evident from their article title -- "Deplatforming Works" -- the data seem to say it works.
As for the specific concern you mention, they look into it as well:
"'The good that comes with deplatforming is, their main goal was to redpill or get people within mainstream communities more in line with their beliefs, so we need to get them off those platforms,” Robyn Caplan, a PhD student at Rutgers University and Data and Society affiliate, told me on the phone. “But now we’ve put them down into their holes where they were before, and they could strengthen their beliefs and become more extreme.'
The question is whether it’s more harmful to society to have many millions of people exposed to kinda hateful content or to have a much smaller number of ultra-radicalized true believers."
From what I understand, deplatforming actually works to deradicalize people, because with every deplatforming the truly radicalized group shrinks. For example, randos from Twitter who went to Parler to follow their favorite alt-right opinion producer, but otherwise don't care about politics, probably won't follow them a second time; they'll end up in less radicalized spaces and deradicalize on their own.
At the end you're left with a small group of severely sick people who can only be helped by therapy, but you prevent mass unrest among otherwise normal people who got radicalized only because getting radicalized was as easy as joining a platform.
I agree with you, but I also don't know what else these companies should do. What happened on Jan 6th was coordinated on a lot of these platforms (including Parler).
It seems like the answer to all these types of issues comes down to education at some point. Not that we all need to learn the same point of view, but that we should learn how to think critically. The people who took part in the insurrection at the Capitol honestly believe a lot of falsehoods propagated on networks like Parler, FB, and Twitter. They aren't all acting in bad faith.
Seems like these networks/platforms have been forced to take the actions they did because people, all of us to some degree or another, choose what we want to believe, whether there's any evidence or not.
If we're granting that the above is true, then the US is basically at war with these people. It's an insurrection.
"If we cut our enemies communications then that's just fueling the fire" is not something you say mid war. You cut and code break and infiltrate and capture and kill and anything else because the time for "deplatforming" and "fuel on fire" and any other business as usual peace time political concept is irrelevant. That game isn't being played any more, the violence game is.
It's like you're playing chess with someone and they just got up and shoved you, and you punched them, and then a bunch of onlookers are all "well, what if they had played knight to f4? Would you punch them then!? Punching shouldn't be allowed in chess!". Except the chess game ended when they got up and shoved you. Treating these people as political opponents ended when they broke into the Capitol building with guns and bombs with the stated intent of killing politicians. They're enemies of the state, and therefore enemies of the institutions that exist thanks to that state and of any individual who likes that state.
What do you think the "deplatformed" people are going to do? Just quit because "oh no our website is gone"?
This event is just fueling the fire IMO.