Erdogan's first speech came almost two days after the earthquake, and in it he said: "We are monitoring who said what on social media and tried to provoke the people. Today is not the day to go after them, but we are taking notes, and when the day comes, we will go after them."
It's like a scene from V for Vendetta. The lighting choice is very particular.
Twitter was heavily used by people in distress to seek help; hundreds of people were even posting from under the rubble. The officials were claiming that everything was under control and that they were helping everyone, but people were posting videos showing the situation on the ground, and it didn't look even close to under control.
Erdogan's head of communications even introduced an app to streamline "reporting disinformation". Like, it's the second day after a massive earthquake and they published a f*king app to snitch on people.
Here is the announcement of the app, at 05:00 local time, 24 hours after the quake: https://twitter.com/fahrettinaltun/status/162277720485259264...
There were also many incidents of the mainstream media cutting off interviews or turning the camera away when people said or did anything that discredited the official narrative.
Yesterday, some people with prominent accounts who shared the tweets from the people in the region began reporting that they were taken into custody by the police. I guess the day has come quickly.
This is one of many, many reasons why western countries and social media companies should not normalise censorship rebranded as "fighting disinformation".
Curiously, all the people who support such censorship in western countries are actually against it in this instance. Almost like whether or not they choose to support a policy depends not on what it does but instead on who it’s being done to.
> Almost like whether or not they choose to support a policy depends not on what it does but instead on who it’s being done to.
Well, or they have a more nuanced view than a flat "censorship yes" or "censorship no".
For example, I support laws preventing stores from selling 3 watt LED bulbs in packages claiming they're 5 watt LED bulbs, or selling generic LED bulbs in packages claiming they're Philips bulbs. Even though the printed text is surely the manufacturer's speech.
Must I support all censorship, because I support this one bit of censorship?
Your scenario is not equivalent. A buyer and seller shake on an implicit contract when a purchase is made, and enforcement of a contract you agreed to is not censorship. A more accurate comparison would be me posting a picture of a 3 watt bulb online, claiming it is a 5 watt bulb. I don’t believe that should be illegal, and neither would you, if you were consistent.
You've missed the point. It's not desirable to have listings which contain false advertising; the remedy for that is to restrict false advertising, not make every buyer chase the seller in a contract dispute after the fact. That would be insanity.
Nope, your comparison is just not equivalent. A sale is a transfer of property. We enforce the implicit contract that goes along with that. The buyer will pay, the seller will deliver, the product is accurate, etc. This discussion is about speech, for which no such contract exists. If you disagree, then you should have no problem arresting people for showing off a 3 watt bulb they claim to be 5 watts.
The poster's point was pretty simple: there is some speech that doesn't deserve to be protected to the highest possible level. Untrue commercial speech like false advertising is pretty much in that category. This is why we allow attorneys general to bring cases against false advertisers, for example.
Usually the remedy is monetary damages, but injunctive relief is also available.
I can easily maintain that this is sensible and reasonable without "arresting people for showing a 3 watt bulb they claim to be 5 watts."
Isn't the crime of false advertising in the failure to deliver what was claimed, and not the claim itself? You can very well falsely advertise your product however you like provided you don't actually sell it to anyone, because once you deliver them a product that isn't what you claimed, that's when the actual crime occurs.
Is there a sale or not? If there's not a sale it's not advertising. If it's not advertising it should be allowed, false or not. If it is advertising it's not just speech. You lack the consistency to make your argument work. So again, I guess you're fine with arresting people over light bulbs.
This is a really strange perspective. Advertising is only advertising once the advertised product is sold? A billboard is simultaneously an advertisement and not an advertisement depending on whether the observer has purchased the advertised item?
Not the GP but no, I'm not fine with arresting people over light bulbs. I am totally cool with arresting them for fraud if it meets that standard. I am also cool with the business being fined or otherwise sanctioned for false advertisement. The product is beside the point. The deception is what the punishment is intended to address.
There’s an implicit conversion between intent to sell and actually selling. For there to be advertising, one of the two has to be true. If I’m showing off my car, am I advertising anything?
You've introduced this 'intent to sell' bit when for most of the thread you were requiring the transaction to proceed to a sale before the advertising can be advertising.
No one has claimed anything about advertising without the intent to sell. Indeed, that's been everyone's point since this bit of the thread started with michaelt's comment. Commercial speech is an area where most people agree that some constraints are useful. Hence it was used as an example of why censorship is nuanced, not binary. People have different standards for what's reasonable and what's not, but only the most die-hard free speech advocates would have no standard at all.
You have frequently taken examples that are clearly in the advertising with intent to sell context and treated them as if they apply in a non-commercial context. This is how we get to you accusing people of being OK with arresting people over light bulbs when they're actually saying they're OK with laws against fraudulent advertising.
As I said in another comment, there’s a conversion between intent to sell and actually selling. The same way if you were against murder you’d implicitly be against conspiring to murder.
If you are selling or intending to sell something, and you choose to have it mediated by the government, whether or not you’re allowed to falsely advertise something has nothing to do with freedom of speech. We aren’t agreeing that censorship is allowed. Censorship isn’t voluntary.
No, I’m saying that if you decide to make a sale, and decide to have it mediated by the government by paying sales tax, then whether or not you get punished for false advertising has nothing to do with freedom of speech. Whether such taxes are voluntary in the first place is a different argument, but I think they should be. So there’s no inconsistency on my side.
It sounds like you have replaced a moral framework with a legal one. But the question of censorship is fundamentally a question of morality. It matters not how legal or illegal specific actions are in this discussion, so the legal means of enforcement that you rely on for your argument are a red herring.
Then you heard incorrectly. What I said is what I believe to be moral. You can’t arrest someone over non violent speech. Force is only justified in response to force. And at the same time, if you agree to enforcement you aren’t getting your rights violated.
Neither "enforcement" nor "arrest" carries moral content. But you use both terms freely in justifying your allegedly-moral position, unfortunately implying your conception of "rights" is one based only on those granted by an enforcement body.
It is hard to discern any moral content here, and I suspect there is none.
Not a lawyer, but I could imagine a situation where you're in a store and you see an advertisement for a blue light bulb claiming that blue light bulbs are proven to disinfect surfaces. You remember you have some blue light bulbs at home, so you go home and screw them in. Or maybe you go on Amazon and order some blue light bulbs because they're cheaper; even though the other brands aren't making that claim, the first one tricked you into thinking it was true. You haven't entered into any contract with the manufacturer making the false claim, yet you are still harmed by it.
Right, but people (and companies) lie all the time, and it's not something that's illegal. It's immoral, but not illegal.
I'm not sure where I stand on the matter personally.
Like, taking the constant social turmoil in America as an example: anything people dislike they call disinformation/fake news, but they are equally guilty of spreading, and okay with allowing, the same behaviour if it furthers their agenda. Or they will at least be more lenient toward their own side of the argument indulging in it, since it aligns with their biases.
I've seen this happen from both right wing and left wing members on Twitter and on Reddit. I personally don't use these platforms, but I've seen various threads with insane amounts of hypocrisy.
There's so much nuance here, and so many unresolved problems.
Like, how do we know truth isn't what's being censored?
How can we even tell, in this day and age, what the truth is? It's often diluted with some hidden agenda, either political or corporate (or oftentimes both). I can't sit for hours fact checking everything I read from different sources. I have a job, and I also don't want to spend all my free time on it. But on the other hand, it's also a topic I care about (just not THAT much, I guess?). Who to trust?
And this "the ones in power control what's true" thing generally feels like a slippery slope.
I also hate that I'm beginning to sound like a crazy tinfoil-wearing person.
Idk, humans are complicated. Social media in its current iteration was a mistake, since it created platforms for some very dangerous and narcissistic people.
In a similar vein, there is a "fruits and veggies" vitamins commercial that advocates skipping meals to afford the vitamins, as they give you all you need.
If there is to be no sale, then is it really advertising at all? If I "advertise" that I own a thing with no intention of selling that thing, then laws about honest advertisement don't apply to me.
My car (not for sale) has a nuclear reactor and can travel through time.
> What can my company do if a competitor is running an ad that I think is deceptive?
> Explore your legal options under federal and state statutes that protect businesses from unfair competition. For example, the Lanham Act gives companies the right to sue their competitors for making deceptive claims in ads.
> Are advertising agencies subject to the FTC Act?
> Yes. In addition to the advertiser, the advertising agency also may be held legally responsible for misleading claims in ads. Advertising agencies have a duty to make an independent check on the information used to substantiate ad claims. They may not rely on an advertiser's assurance that the claims are substantiated.
I don't see anything about false advertising requiring a sale.
No, his scenario is not equivalent, but here are a few more:
"Hello, this is just a reminder that your voting station will be open between 5 pm and 9 pm in (somewhere where it's not actually located), please remember to vote!"
"Hello, if you would like a free ride to your voting station, please text us 'YES', and wait at (location) between X and Y pm (Nobody will show up)."
"Hello, we have a great offer for auto insurance, blah, blah, blah."
"Hello, just a reminder, millions of trustworthy people believe that <opposition candidate> was responsible for <something untrue and horrible>. This isn't slander, because we are just strongly implying it in this robocall. Also, they live at XY address, and won't someone rid us of this meddlesome priest?"
"Hello, all the doctors are lying to you, buy our snake oil wellness supplements, instead. They are supplements, not drugs, we don't answer to the FDA."
"Hello, let's go down to sixth and Broadway next Tuesday, and make some noise/put the fear of God into <group>"
"<Ethnic minority group> is burning this country's forests down using space lasers."
"Hello, please be aware that it's illegal to discuss your salary with your coworkers."
Just so I'm clear, is your argument that these statements should be censored?
Lots of these seem reprehensible, but totally within a reasonable space of legitimate free speech. Barring the cases with calls for imminent harm (e.g. turbulent priests and such), it seems that there's a big risk in trying to draw a legal line here.
In your polling station example, for instance, that has some clear negative impact. If I instead said "The library will be closed for the next month", is that protected though? What if I say "Arby's will be giving out free sandwiches from noon to two o'clock tomorrow"? All of these statements have negative impact.
The rationale for more extensive free speech is that individuals are expected to have the right to hear what others are saying and to evaluate for themselves how bogus it is.
To be clear, some censorship is absolutely accepted in society today (e.g. libel laws, imminent threats, marketplace standards, etc), but the fear of an ill-defined line that can be shifted to suit political winds seems like a very reasonable one.
Except for the last, those are just lies that people in power find inconvenient. People were saying way more ridiculous things in the summer of 1787, and yet the country I live in enshrined free speech rights.
They also enshrined slavery, assuming you mean the US. Who cares what they enshrined? They have no credibility. If the merit of the argument holds no weight, then the fact that “the founding fathers” made it doesn’t make it more (or less) correct.
That depends entirely on who does it. An individual versus a government official.
The overarching point still stands. That people being in favor of censoring "disinformation" (ie speech by private individuals) in certain instances but not others is an inconsistent stance that carries a distinct appearance of partisanship.
Half of these scenarios are just variations of the comment I replied to. Again, there's a contract of sale. Your speech isn't being violated when you sign a contract promising you're selling what you claim to be selling.
As for the rest, you're free to make whatever lies you want about others. Free speech doesn't allow you to initiate force against others so I'm not particularly concerned with any of it. We have gun rights to deal with people that try to enter your house or attack you over your skin colour and space laser accusations.
This is you fundamentally misunderstanding what is meant by "speech". The statement itself is not illegal. The act of misrepresenting the transaction being agreed to is what's illegal.
A specific act that involves many components, only one of which happens to be speech, has been made illegal. The speech component itself has not.
There are certain situations where speech itself is restricted. This is not one of them though.
Saying these bulbs are 5 watts when they're 3 is a lie, but it shouldn't be censored. Selling someone a 3 watt bulb while claiming it's 5 is fraud. Sales are subject to different laws and standards.
Exactly this. Censorship should be a nuanced topic if anyone gives it any serious thought. Not all censorship is at the same level, and there is obviously a difference between shutting down Twitter and not letting folks criticize the government and, say, not telling folks misinformation that can lead to another's harm or death (fake cures, for example).
The problem with letting whether something should be censored be a matter of interpretation, based on the obvious nuances it entails, is that the exact nuance is not universal among all people. This is why ZERO censorship is the only reasonable amount of censorship.
It would be interesting to get your opinion on censorship again if a specific lie was gaining exponential traction and threatened the livelihood of your family. Or even if there were something less direct, like if you couldn't go see a movie in a theater because a huge tiktok trend of yelling "Fire!" and filming people trample each other gains the spotlight of millions of users.
And it would be interesting to get your opinion on censorship if saying things you believe in meant the government would hunt down and execute your entire family.
Of course that is a ridiculous scenario, as are most straw man arguments.
And the "shouting fire in a crowded theater" trope was originally used to argue for censoring speech opposing the draft during WWI, so if anything it's an example of why we should absolutely defend free speech even when there are seemingly good arguments to restrict it. The better way to prevent stampedes in crowded spaces is safety regulation requiring adequate exits and occupancy limits, which we do have. Movie theaters are also not public spaces, so they can easily ban people for causing a disturbance.
I understand that you mean zero censorship for adults. I am interested in your position regarding people younger than 21/18/16 (exact age depends on the country I guess).
I worry that the nuance here is not actually down to interpretation, but the consequences of extremely rapid dissemination of information. I don't even think those of us in tech have come to grips with how that matters, let alone if/how we should make efforts to get around it.
The latency on information coming from a newspaper or even TV news is limited to how rapidly those sources can vet the information before publishing it, and then the latency of the publication method. If someone wants to use those avenues to spread misinformation, they have to do a lot of legwork to make the lie sound plausible enough that the publisher will put the effort into publishing it.
What's the legwork for someone planning to spread misinformation via twitter? How long does it take "for new facts to emerge" that address the misgivings incredulous consumers hold? The power of social media is that the real truth can emerge very quickly, regardless of what the powers that be want. But the threat of social media is that there's basically no organic way to prevent abuse of that power.
Others are right that the line between misinformation and difference of opinion is often quite narrow. But I don't believe there is no difference, and every society must choose and hold its line, according to its own values.
I have trouble grasping the extent to which you appreciate the degree of thinking done by other parties for your own benefit.
It is the case that, unless you are a polymath with copious amounts of time and money, you will never be able to accumulate sufficient information to make fully educated decisions in a world truly devoid of censorship.
By fully educated I mean starting from the bottom up and going all the way through to reach a conclusion.
This means that every single resource you will use has to be verified, all experiments need to be conducted and evaluated.
If we don’t censor anything, then there is no point in books because editors are censoring information there.
What’s the point of journals and publications without a review process to filter out the garbage?
You could very well be a reasonable person, but without censorship to varying degrees, your capacity to filter garbage is hindered: you simply can't know anything to a reasonable degree of certainty, precisely because no form of censorship exists. Your ability to filter information against prior knowledge is hindered, because there is no authority of any kind you can trust; there is no way to measure the quality of anyone's speech outside the scrutiny it passes through in established channels and authorities that already have a reputation we can trust.
The reason we do censorship is because not all speech is benevolent, useful, or true. People will abuse the absence of censorship to attack “unwanted” people, to push lies, to misrepresent information, and to push an agenda instead of letting facts do the talking.
Why is it difficult for facts to do the talking?
> Brandolini's law, also known as the bullshit asymmetry principle, is an internet adage that emphasizes the effort of debunking misinformation, in comparison to the relative ease of creating it in the first place. It states that "The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it."[1][2]
Now, the more bullshit that floats around, the more bullshit propagates, and thus the less true information exists, eventually suffocating us in information garbage.
If you wish to see this in action, look into antivax groups convinced that vaccinated individuals will start dropping like flies within the next $LATEST_GOALPOST.
Or, take a look at how long it took me to explain why absolute absence of censorship simply does not work just from a view of having a consistent world model and having nobody to trust and no way of doing so.
One could extend my comment to include damage done to people or entities intentionally or otherwise. Notice how Musk pivoted out of absolute free speech when advertisers left?
I have seen the same thing, but I think the cause is more that maybe the overwhelming majority of people can’t meaningfully do abstraction? Like they blow a gasket between conceiving of individual actions and beliefs vs policies related to actions and beliefs?
So people like this will say sentences like “censorship is wrong” but implicit in the sentence for them is “censorship of correct and/or my information is wrong” because the sentence in their mind must be concretely about something like “correct information”. There’s no way for them to easily think the thought “censorship is wrong in the abstract.”
This also feels related to being able to use “veil of ignorance” type reasoning. I don’t think they know how to imagine being not only a different person, but an unknown person.
I also don’t think they know how to imagine being wrong in some sense, not in the epistemic humility sense (although, yes, that too, much of the time), but in the “suppose I’m just deeply confused about everything, how would I want to be treated? How might I recover?”
I think all these are bottlenecked on a mental abstraction ceiling, and I think all these things work together to contribute to the effects you noted.
The way American public schools (and I presume other schools) teach kids about censorship and propaganda probably has something to do with it. It's not "propaganda" unless it's telling lies. And it's not "censorship" when it seems justified. Propaganda is taught to be synonymous with the promulgation of falsehoods, but in reality propaganda will tell the truth whenever the truth is convenient to the propagandist. To a propagandist, truth and fiction are simply tools to be used whenever either helps the propagandist accomplish their objective.
I agree, probably almost no one truly believes censorship is wrong in the abstract. Similarly, very few people believe in absolute freedom. I’m not sure there are really any belief systems where those two things can be truly consistent. There is a real issue with people wrapping themselves up as absolutist ideologues while spouting “illegal for thee, but not for me.”
IMO the issue in the entire discussion of "censorship" shouldn't be censorship vs. no censorship, but the outrage amplifiers social media has become. It makes the already huge problem of Brandolini's Law / "flooding the zone with shit" so much bigger by amplifying random voices that drive engagement, which is most effectively driven by outrage. I'd wager that addressing this issue would suddenly reduce the number of discussions about "censorship".
Government censorship (and arrests!!) is in no way comparable to the “censorship” happening in western countries. (Namely, ToS violators getting banned, and government figures suggesting misinformation to look into.) The comparison is just obnoxious.
It’s not. Traditionally the government is in charge of the public square. If a private company is enforcing TOS violations while operating a public square, and operating a public square is traditionally a government function, it’s valid to argue violators are entitled to free speech protections.
Starting with you, do you support social media censorship in the United States? And do you support social media censorship by Turkey? Those are yes or no questions.
Not really, no. All three cases (US Gov, Turkish Gov, Elon) are censorship of the public square in the end. I'm against all of them. As far as I'm concerned the US constitutional standard for free speech should apply to any platform above a certain size in terms of daily active users that facilitates communication between those users.
It's almost like the goal is not censorship for censorship's sake, but rather the goal is actually protecting the more vulnerable (which in this case is clearly the Turkish people and not Erdogan). It's almost like there are principles at play!
My principle is that force is only justified in response to force, and speech is not force (unless it makes a threat), false or not. If you don’t agree with that principle then I’d like to hear it explicitly.
I don't agree that speech is not a force unless it makes a threat.
Think about what makes you consider "threatening speech" a "force": presumably it has nothing to do with particle physics... is it perhaps that the act of speech itself causes people to adopt behaviours they wouldn't normally adopt? ie.: that it can carry some coercive weight? That it can be used to exploit/endanger vulnerable people in some circumstances?
Speech may be a very very weak force in most (or even "nearly all", just for the sake of argument) circumstances, but there are tons of examples of speech being a powerful force, causing large social and historical shifts. There are both positive and negative examples of this.
I don’t consider threatening speech a force, I’m opposed to it because it threatens force. You can argue that’s an exception to my principle but it’s not far out of line. I don’t try to ban other forms of speech, however much they hurt your feelings, because they don’t threaten force.
Hurting feelings is unfortunate, but generally not considered to be a reason to suppress other people's speech: we rather choose to suppress our own speech when we think it might hurt other people, and choose to do so freely.
Almost all "contemporary reasonable censorship in functioning liberal democracies" are attempts to prevent people from exploiting the vulnerability of others: speech that causes people to make *patently insane and irrational* medical decisions, for example. Really stupid things like fake cancer cures, and suggestions to drink bleach. And typically, for proponents of censorship, there is a whole spectrum of acceptability as well: the more it's a grey zone, the less the censorship is acceptable. When the censorship clearly and obviously only constrains the actions of malicious actors, and clearly and obviously protects vulnerable people, it's seen as a win. This is always highly contextual, and limited by the extent of scientific knowledge.
A common theme is that censorship to protect those in power (the government) is bad, and censorship to protect those who have the least amount of power is... well, not great (it's definitely always better if it's not needed), but not bad in the same way.
Speech can’t hurt you. Speech can threaten to hurt you, which I already stated I’m against. Other instances of censorship are blatant violations of the principle that force is only justified in response to force. Telling someone to drink bleach is not force. You’re trying to bend definitions to be able to make it seem equivalent to forcing someone to drink bleach, despite no force being involved.
Not sure that would make a difference. Western societies have an issue with genocide, so Russia claims Ukraine was committing genocide against Russians. If Westerners didn't care about genocide, Russia would've claimed something else, Putin certainly has no issue with genocide.
So if our thing du jour wasn't misinformation, I'm sure they'd just claim they're using their powers to combat hate speech or whatever else is culturally okay. If there's really nothing available, there's always "we need to protect the children from predators who are trying to abuse this dire situation". In the end, it doesn't matter if you're in power. Your tribe will always be with you, no matter the reason you give, and the opposition will be silenced.
You probably don't understand the nuances of why Russia says what it says; it is not only about Western values, but also a very effective tactic in information warfare.
I know about at least three major points:
* Russia always blames the other party for the crimes it is about to commit. This ties the enemy and its allies up in "you did it; no, you did it" kinds of arguments. It also helps push all sorts of fake narratives into the enemy population. We witness this over and over.
* It is a power play. Russian people very much enjoy the fact that their officials can say and do whatever they want and the world can do nothing. This gives Russians true joy (as witnessed first hand). It also heavily demoralises the enemy, because instead of pointing to the obvious crime, "not everything is so clear" now.
* It cements alternative history. For example, children in school all over Russia will learn about the current events based on the "official" alternative reasoning.
I believe the Western values are what makes it an effective tactic, and it's the same reason why they're calling the Ukrainians Nazis while simultaneously running towards fascism: if you say that X is a Nazi and is committing genocide, you'll have a visceral reaction in the West that creates sympathies/doubts of other narratives in at least part of the population (this will not work the same way/to the same degree in non-Western societies). If the West was fascist, I'm sure Russia wouldn't be talking internationally about Nazis and genocide because it wouldn't have any impact.
You're very right about the historical narrative as an internal component. Connecting current conflicts to WW2 will make people fall in line much more quickly than saying "it's about natural resources that were mistakenly buried on the wrong side of the border and also we want a land connection to Crimea". Much like Americans will always invoke freedom because it's part of their national identity and people are less likely to question why they're invading Iraq if you mention it. Russia being threatened by fascist forces in the West is a continuation of the existing historical narrative and neatly ties into it.
It's definitely a much better PR strategy than Germany had in Afghanistan during Operation Enduring Freedom, where our secretary of defense prominently said that Germany's security was being defended at the Hindu Kush. It makes sense in the abstract ("we need to confront those who disregard human rights even if they're 5000 km away"), but it's not concrete, and it didn't increase support for the war at all.
Maybe... but Russia has for decades carefully tended to the WW2 mythos of Soviets defending the world against nazis.
The anti-Nazi propaganda is not only for foreign ears, but for domestic consumption as well. "Us against the world, who is out to get us. All current problems stem at their root from the blow dealt to us during the Great Patriotic War."
However, disinformation may very well lead to totalitarian regimes, propped up on distorted reality and fake news. A democracy must be able to defend itself.
Democracies have always survived by making their subjects invested and empowered in the future of their country. There's a limit to how far disinformation can go, and, frankly, the only thing people are trying to achieve with censorship is moving that line the wrong way.
The spread of wrong information has always been overwhelming. This is nothing new, nor is it (imho) more irritating now than it was 20 years ago. And if the history books are accurate on how bad it was 100 years ago ... This is not what's destroying democracies.
Sad to see this being downvoted. Rational voices have been censored on many western platforms for saying things that went against the popular narrative, the most recent obvious example being during the pandemic. Some of the things censored were batshit crazy, some of them not, but blocking discussion and letting those in positions of power decide what is "disinformation" and what is "information" isn't the way to deal with the crazy stuff.
These companies are already deep in hyper-partisan censorship. Take reddit, for example. Nearly every sub is a left-wing echo chamber, with essentially the same comment being made, and opposing viewpoints hidden, shadow-banned, or deleted under the guise of 'disinformation' even if they are factually accurate. Some of it is partly driven by liaisons within the US and other governments, and supported by media. That is both problematic and terrorizing. Media was always supposed to be a counter-balance to government over-reach, but today, media is in on the scheme.
Except you can start your own subreddit and choose your own moderation policies. The inaccurately named /r/conservative has a lot of traffic and there are many more.
Being able to start a subreddit is meaningless when right-leaning subreddits are systematically shut down, or restricted to the point of becoming inoperative, as soon as they pass a certain threshold of popularity.
That's a funny joke considering 99% of subs which don't carry water are eventually banned for "misinformation" and are filled with bad actors and sock puppets.
That depends where one finds oneself on the political scale. A communist is only going to find a neoliberal, semi-fascist conservative consensus, where someone on the extreme right finds a socialist echo chamber.
What reasonable viewpoints do you perceive are being deleted and accused of being disinformation?
Stuff like /r/shitredditsays exist/existed to point out reddit being misogynistic, and capitalism is the name of the game for all the large subreddits.
I don't think you'll find a large subreddit that is actually leftist. It's just centered on what the population thinks, both for overall trends and for the reddit user base.
This doesn't match US politics because US politics don't match popular opinion, driven instead by the arbitrary layout of states and congressional districts.
Exactly! In the US we routinely have "blessed" media organizations openly discussing the need to legislate/limit threats to their ability to "control the message"...but Erdogan cuts to the chase and everyone flips out.
I mean...the day Musk took over Twitter, lots of comments here were calling for twitter.com to be kicked off the internet...
I am old enough to remember when the Turkish army was overthrowing its government on behalf of the West and abusing its own citizens, and none of you bleeding-heart liberals were shedding any tears. The world does not subscribe to liberal secular values, and the Muslim world certainly does not. You can censor all you want, but when Charlie Hebdo published cartoons making fun of earthquakes, and while ugly hordes of Indians were gloating over the death and destruction, it is only prudent to censor this kind of garbage.
If you are unhappy living in the West, there is no reason whatsoever why you cannot move to a country with more compatible values. Russia is accepting new immigrants and that is certainly an option available to you.
This is a bit like saying "the darkness is bad, so the solution is light, and if the light ever becomes blinding, the solution is more light". Speech is not the point, it's just a medium for useful information. More speech is only good to the extent that it proliferates useful information. If useless spam speech is used to overwhelm useful speech, then that undermines the whole point of the principle of free speech. And with the advent of modern technology, overwhelming the useful speech is easier than ever. Our historic ideas of free speech are wholly unprepared to deal with the overflow attacks that are possible in the modern age. Is it not "censorship" to shout over someone so loudly that they can't be heard?
> This is a bit like saying "the darkness is bad, so the solution is light, and if the light ever becomes blinding, the solution is more light"
Completely untenable analogy. The reason people say the solution is more speech is the antecedent concept of the marketplace of ideas: by and large, ideas are sifted via free speech, and the good ones last while the bad ones die. You cannot have this sorting process in a system that stifles free speech.
If you want to attack free speech absolutism, at least attack Mill's axioms, not some made-up poorly-formed straw man of an analogy.
Re: "the marketplace of ideas", there is a modern philosophy that ascribes mythical qualities to markets, but the fact is that markets are not magic, they are merely useful tools that are appropriate for some contexts and inappropriate for others. There are numerous ways markets can either fail or be worse than useless, and one of those ways is by obscuring information about the market, and it's increasingly trivial to use unfettered free speech to effectively censor any other market participants, thereby distorting the market.
> concept of the marketplace of ideas and that, by and large, ideas are sifted via free speech and the good ones last while the bad ones die
This might be the core of it then, because while I generally agree with the idea of Free Speech in the abstract, I don't think that it's been having good outcomes for society in the era of the cultic milieu, and I think that has to do with this 'marketplace' idea not really holding water. (Maybe I just don't want to admit that my sense of what makes a "good idea" is wrong, and the success of eg Qanon as a meme means that I should just change my thinking and accept it as a winning meme and therefore a Good Idea.)
The problem with alternatives to the "free marketplace of ideas" is that they are all (by definition) authoritative. And who's the best "authority" here? The government? Me? You? Who decides what ideas get to be censored? Imo, there are more problems with the authoritative model than the marketplace model.
Kind of like Churchill's famous quote about democracy: "it's the worst form of government, except for all the others that have been tried."
This was true in practice before technology made communication and synthesized speech trivial. Unfortunately for us all, we no longer live in a world where "more speech is unambiguously better" is true. Technology has ruined the meme of free speech. I'm not saying there are any unambiguously good alternatives; we are all diminished by its loss. But clinging to its corpse isn't doing anyone any good.
As for the Churchill quote, the reason that democracy has those qualities is because it inherently involves compromise. An uncompromising philosophy like "more speech is always better" is quite a different thing. We need a new rallying cry to replace the meme of free speech, probably something to do with the notion of signal:noise ratios, but it's beyond me to invent such a meme.
> As for the Churchill quote, the reason that democracy has those qualities is because it inherently involves compromise.
But it's the same with free speech: you said it yourself, there's a lot of noise, that's the compromise. You don't know what the "one true Good Opinion is" until you hear from everyone, and that might include racists or nazis or whatever. I don't really think technology has anything to do with it. In fact, the printing press probably made a much larger impact on speech than the internet did (and we got through that just fine).
I think a large part of the problem we have today is lack of attribution. It is too easy for participants to pretend to be 1000 voices instead of a single voice. I don't think being able to pretend to be multiple people is particularly helpful to the "marketplace of ideas". Conversely if I see/hear the same sentiment a 1000 times and can distinguish between "1000 voices" and "one voice a thousand times" then that is useful for me determining how much weight to give to those ideas when forming my own ideas.
The other nice aspect of attribution is that there is a factual answer to "who is speaking". I don't need to evaluate the merits of your speech in order to understand who is speaking. In today's world it may not be easy/possible to know the specific answer to "who is speaking", but by observing any speech I know that there must have been _someone_ speaking.
As for how to get to a world with attribution... not so sure what the options are that don't suffer a lot of the same problems. But it at least seems like an approach that doesn't require authoritative assessment of the underlying speech itself to yield societal benefits.
I see your “The solution to bad speech is more speech” and raise you a “A lie can spread around the world before the truth can get its pants on”. Call or fold?
Call. It's not about who's fastest, it's about who stays in the end. Despite us humans being a lying, cheating bunch, knowledge and access to education continues to grow all over the world. That's certainly not thanks to censorship!
Not so fast, pardner. The market can remain irrational longer than you can remain solvent.
You are making two mistakes here: the simplest one is thinking that the winner of the argument is whoever has the last word. This is very frequently not the case, as any troll could tell you. Often, the power move is to get opponents mad and then move on, leaving them impotently fuming about how terrible that person is.
You're also identifying with the larger group at the societal/species level, which is valid, but is also a way of ignoring problems in the here and now. It's like zooming out from a bloody battlefield to look at the earth in space. With an extraplanetary perspective, Earth looks so beautiful and peaceful...but that doesn't help a single person who is impacted by the bloody battle.
> It's not about who's fastest, it's about who stays in the end
And that's where authoritarian regimes get you. People usually aren't that invested in a conversation, or in disproving firehose-of-falsehood-style comments placed by people monetarily incentivized to make sure the correct opinion stays in the end. If the platform has a rating system, they also generate accounts to manipulate it for the same purpose.
>It's not about who's fastest, it's about who stays in the end.
This is contradictory. Not every reader is going to "stay in the end" to learn what's really true. The past few years have given plenty of evidence for that.
How do you define a lie? Please be precise. What source of truth is used to evaluate possible lies, and who gets to make the final determination? What level of confidence is required? Does intent matter? Does a statement count as a lie if it is factually correct, or at least not provably incorrect, but still potentially misleading or lacking relevant context?
Do you also unquestionably accept Putin's, Xi's, Kim's and Erdogan's election results? What reason do you have to believe that the situations there are different other than what you have been told? I have no reason to believe that Trump won in 2020, but elections are something that inherently requires trust in the institutions running them, which makes the result something very much different from an unquestionable fact if that trust is broken.
I really don't think that's the case. What makes you think so? In my experience exposure to more viewpoints, even flawed ones, increases understanding and helps critical thinking.
Experience. We've also basically seen it during the pandemic, happening all over the world in the last few years. Though, this was a bit of an extreme situation for everyone.
> In my experience exposure to more viewpoints, even flawed ones, increases understanding and helps critical thinking.
Simple exposure is only helpful if people are able to handle it, are willing to invest the time, and the one delivering it has no bad intentions. The ones for whom it will not work are left as the victims of this strategy. Maybe they will find their way after a long, painful process. But by then the harm is done.
> We've also basically seen it during the pandemic, happening all over the world in the last few years. Though, this was a bit of an extreme situation for everyone.
You mean the pandemic where alternative viewpoints were systematically suppressed in the western world?
Everyone has a plan until they get punched in the face. A lot of social media dynamics are not reasoned arguments; they are the product of emotional outbursts, which can be manufactured and herded.
The world is not made up of Spock-like rationalists, and although most people are capable of rationality, many social settings are not conducive to it. Look at stampede disasters: every year people die because panic breaks out among a crowd in a constricted space and people start to operate on instinct instead of thought.
This is a very interesting perspective but I don't think it's an argument in favor of censorship. Rather I think it's an objection to the way in which current social media "spaces" are "laid out" in a functional sense. Similar to how building and fire code both take various dangers into account.
There's also an element of individual freedom involved here. If you choose to keep climbing into the boxing ring and then trying to have a reasoned discussion and failing, perhaps you are the one making poor choices. That doesn't necessarily mean that outlawing boxing rings or otherwise regulating who can participate when and how is either a good or workable solution.
This is disingenuous. I've been exposed to alternative viewpoints, saw what I consider obvious holes, and either expressed disagreement or disregarded them entirely. If someone can convincingly (from my perspective, not theirs) illustrate a flaw in my reasoning at that point I have changed my mind by definition of it being convincing.
When people say thing like "other people aren't interested in changing their minds" what they really mean is "other people didn't find my arguments convincing".
I mean, this is a well-known phenomenon. People not only disregard others' arguments, they even reject facts if they contradict a core worldview.[1][2][3][4]
You trying to rationalize this as "other people didn't find the arguments convincing" is a meta version of this. A deliberate ignorance of this cognitive bias even when given documentation of its existence.
That would only be the solution if the problem was bad speech. The problem is that algorithms optimise for engagement and therefore amplify increasingly extremist views. You cannot talk your way out of that position, it surely must include intervention?
These algorithms are a multiplier to the problem, but they aren't the root cause. You can find plenty of misinformation on cable news and you can't blame that exclusively on algorithms.
Not the root cause, no. I'd say the root cause is reliance on advertising for income because that means that you now need to draw as many viewers/users as possible, and that's why they start to create content which maximizes reactions or engagement.
Are you trying to say that biology is the root cause? The content is successful because it elicits a reaction from us. Whether a good reaction or a bad one, we still engage with it. Whether you're binge watching a TV show you love or hate-scrolling a subreddit that makes you angry, the content is designed to elicit some form of reaction that keeps you coming back or memetically passing it on.
However that is not a new development in our biology, we've always been that way. It's just that we're exploiting those aspects of ourselves more and more. Weaponising it to extract as much profit as possible.
You could go another way, I guess, and say that capitalism is the root cause as "profit at any cost" allows the justification of these actions.
There are two aspects to the root cause. You mentioned the first part, basic human biology pushes us towards content that elicits a reaction. The second part is that lies inherently have an advantage over the truth to elicit a reaction because they are only limited by human imagination while the truth is limited obviously by the truth.
As long as humans have the free will to engage with whatever content they choose, there is a natural human reaction to engage with lies. Therefore, the idea that truth eventually wins out over lies is inherently flawed. It isn't the fault of capitalism or algorithms. The problem is human nature doesn't respond properly to bad speech. We can't change human nature, so the only option left is stopping bad speech.
We haven't recently invented the concept of a lie, so the idea of "that's just how we are" rings hollow. People are being radicalised by qanon, incels, redpill, the alt-right, <insert the group you dislike> at an increased rate because they're being force fed this content on a daily basis, and they're being siloed into custom fit echochambers by algorithms optimising for their engagement.
Which brings us back to my original point. Algorithms are just a multiplier.
If you give people two history books, one with actual history and one with interesting conspiracy theories sprinkled throughout, most people are going to find the one with the conspiracies more interesting. Stopping people from photocopying the book will help slow the spread of the false information in the book, but it does nothing to stop the underlying problem that people are writing fake history books.
It's a multiplier sure, but this is a completely new issue we're having (at least on this scale). It's caused by social media, that's not really up for debate.
You cannot prevent lies or deception; that's not within the realm of what we're capable of. Don't let perfect be the enemy of good.
This isn't a new issue created by either social media or the internet. It is an innate problem with absolute free speech and has directly led to the US entering multiple wars over the last 125 years. It doesn't matter if the lies are disseminated through newspapers, cable TV, or Twitter. Mass media is just an amplification device. That has been true for centuries. The only new aspect is that the rate of amplification is increasing which leads to the underlying issue becoming even more problematic.
I would argue that the reason this is becoming a bigger problem across the world is because the US is exporting its ideas of free speech through the internet, tech companies, and social media in a way that wasn't done in prior generations of mass media.
I think we need to change the way we think about free speech. We already have laws that restrict free speech when it defrauds or defames an individual. We need more general laws for when it is society at large that is defrauded or defamed.
Would China's inability to contain conversations give you pause as to how achievable that goal is?
Would you be looking at criminalising lies and the spreading of them, even without knowing it's a lie? If so, how many people would be prosecuted? If not, how will it help?
China is trying to control conversations as a means of controlling the population. That comes with its own problems that are unrelated to regulating speech.
> Would you be looking at criminalising lies and the spreading of them, even without knowing it's a lie?
I don't know what you mean by this. We don't know whether a potentially defamatory statement is a lie or not until there is a trial. This would be no different. We don't need a Federal Truth Commission or anything. American society already seems satisfied letting a jury decide what is or is not a lie today.
So in effect you don't care about the issue and are content with how things are? I'm sorry, I'm not sure that I understand what stance you're taking. Misinformation is not an issue to you, would that be correct?
Requiring a trial to do anything about a lie on social media would mean that there would be no change. By the time the trial comes, the lie is old news and there is some new conspiracy taking grip.
That would be, essentially, meaningless. Not in any way, shape, or form preventative.
If we lower the speed limit of every road in the country, the average speed of cars would drop even if we didn't increase the number of people we ticket for speeding.
We should change our laws to allow for more restrictions on purposeful lies that damage society. We shouldn't allow ExxonMobil to spend decades researching carbon emissions while knowingly lying about the conclusions of that research to the public. They should face repercussions for that as well as the people who knowingly spread that misinformation. I think that would be more effective than what you seem to be proposing which appears to be either limitations on recommendation algorithms or ad supported content.
It sounds like the problem implicit in your framing then is "algorithms", not extremists--a stance I agree with--which would imply the disinformation isn't coming from big tech's users, it is caused by big tech itself, and maybe it is the algorithms that should be limited and regulated instead of the discussions.
I doubt this, because it was possible to 'go viral' to a limited degree way back in the days of Usenet, without any algorithmic selection or amplification going on at all. Preferential attachment is a social phenomenon which businesses leverage for profit (not just in the era of social media, but since the establishment of brands, and before that, in the awareness by politicians and theatrical promoters of how compelling parasocial relationships can be). Algorithmic manipulation is a problem, but you could switch it all off in the morning and you would still have the same issues. Any system can and will be gamed.
Give it 3 days, and it will populate the myth that purple elephants are responsible for 9/11, because their pink mousy overlords demanded more moon-rock cheese with lizard-flavor.
People do a good enough job of making up their own nonsense about 9/11 being perpetrated by the usual suspects using space lasers, holograms, mininukes, etc. We don't, and shouldn't, censor any of that.
Cliches aren't an argument, but they're good for clicks. You can do better.
The solution is not as simple as 'more speech' because we are dealing with a historically unprecedented volume of amplified and recorded speech at scale, so things like signal:noise ratio become very important. Simplistic responses like yours suggest an unwillingness or inability to engage with the complexity of the issue.
A simple example of why you're wrong is that it's easy for an actor with access to an echo chamber to launch a viral cascade by modeling outrage over some made-up or minor issue to boost engagement. Fans of the actor share it out of agreement, amusement, or for the pleasure of owning their opponents, while opponents of the actor either deride or argue. The actor gets to appear in the day's trends, gain new followers/subscribers, boosting influence and financial intake.
But there's also a more subtle transfer of wealth, not so obviously measurable by the metrics of the social media platform. The actor spends 1-5 minutes composing their message, like you spend only a short period to select and repost your cliched response. The time that other people spend replying takes at least as long and often longer. So while the actor may have invested 5 minutes out of their 24 hour day in amplifying their own profile, by saying something deliberately controversial, they've caused others to put in significantly more time to responding.
Here, that's not such a big deal because HN discussions are not that complex and people have time to read and consider every comment in a thread they're interested in - so your claim has some validity. But on large social media platforms, the scope is radically different. A provocative tweet can generate thousands of negative replies, so that an investment of 5 minutes by the provocative actor can cause people to collectively put hundreds of hours into their responses, which unconscious social labor serves to amplify the provocateur. There are lots of influencers who have this down to a system and do it on an almost daily basis. It's essentially a theft of time - a little from their fans, a lot from their anti-fans, with that time being converted into engagement and increased reach.
This is why your reflex response is wrong. Bad speech can be designed to yield more speech that amplifies the original bad message for fun and profit. Your concept of reasoned debate only functions in forums where there are some leveling factors like equal time allocations or legal procedures for the allocation of time. Here on HN and on many small forums, that function is done by moderators (who can ban intentionally disruptive users or manage anti-spam filters) and to some extent by users (whose expressed preferences have direct weight like downvotes or flags, and indirect weight by their long-term community participation making them familiar to each other). On big platforms, the calculus is completely different and your model of reasoned debate breaks down.
the low s/n then makes it about who can be loudest and/or most provocative. i don't disagree that disinformation shouldn't just be squelched, but it's not as simple as "education" or "more speech" and as humans we tend to prefer the path of least resistance.
Dmytro Zolotukhemin at the Institute of Information Security in Ukraine has an interesting take on this which he talks about in detail on an interview with Silicon Curtain on YouTube. He asserts that the idea of disinformation is so slippery and hard to work with that it is not worth it. What we really have is stories that people share. His idea is that the right way to regulate this is to provide people with tools for assessing information themselves. Basic critical thinking and internet searches can go a long way to revealing both truth and ambiguity. This is a specific variation of the idea that the answer to bad information exchange is more and better information exchange.
The solution is obvious: it is called education and critical thinking. But those same kinds of governments are also very afraid of a too-educated population that could challenge their authority!
Because their customers/voters want it? Me, I very much prefer censored HN to free speech 4chan. The question is, what are you doing here if you prefer 4chan where the "truth" is not censored?
As someone who visits both HN and 4Chan, I like having the choice. I definitely don't want the government deciding that 4Chan is illegal and wiping it off the 'Net because it contains disinformation.
When you ask, "How should governments fight disinformation," what you're really asking is, "How can governments deny their citizens the right to choose what information they read, watch or listen to." When it's phrased like that, it's not so appealing, is it?
No, it is the other way round. If there are no places where companies and/or governments censor/limit free speech/fight misinformation, every place turns into 4chan and you lose your ability to choose. I'm happy there are places where speech is not restricted. And I'm even more happy that there are places where speech is restricted. Do you think HN should stop moderating discussion?
Do you believe every online forum accessible to Americans should have as strict, or stricter moderation than HN, because it's been demanded by a subset of voters?
There's a hidden assumption in your argument that the only way that forums like Hacker News, etc, can moderate is if the government allows it. Else why would you mention governments and private forums in the same sentence?
I support private forums being able to moderate. I oppose governments having that same power.
Nonsense, if you tolerate disinformation you get more disinformation. Countries that fail to address it will fall and the world will be worse off because of it. The right to free speech does not include the right to a free platform or free promotion.
It does include the right to access the public square though. And if a particular free platform has become a public square by virtue of size then I think it does include the right to access that platform.
Such platforms should probably not be permitted to promote in a general and unrestricted sense the content of particular speakers that make use of them. That's no better than clandestine government propaganda. An obvious wolf in sheep's clothing.
I hear this question posed a lot in this context, but we already do this, don't we? Someone (or some group) decides which laws are valid, which educational material to teach our kids, etc. Are those not ok because some person or body of people has decided for the rest of society?
This narrative has killed many republicans who feared getting vaccinated, and it will only kill more. One could argue that it is their own actions, but I am not keen on blaming people for their circumstances.
The answer is: we decide what is disinformation through our institutions and the value we place on them. Your opinion and mine on certain matters should never weigh remotely as much as an expert's on a particular topic, unless we are such experts.
We ought to measure the value of one's speech through the scrutiny it has to go through to be published at trusted channels.
Agreed, disinformation is dangerous and it should be treated as such. Just because you say/write it doesn't mean I'm required to host it. Freedom to promote stupidity does not outweigh my right to unplug my microphone.
Can anybody elaborate why? I understand if it's financial crisis or some other gov fuckup. But this is a natural disaster on an unprecedented scale, government may not have enough resources to handle it. What's the game here?
So Erdogan himself came into power after the 1999 earthquake devastated the country and wrecked the economy, leading to early elections, which his party won in a landslide in 2002.
That earthquake was like nothing seen before, hitting the most industrialised and populous part of the country, and they had to introduce new taxes, like the luxury consumption tax, to fix the damage and prepare so that this would never happen again.
This luxury tax is applied to pretty much anything, and that's why an iPhone 14 Pro costs $2,300 in Turkey (the US price is $1,000). Turks also pay 2x to 5x the EU or US price when purchasing automobiles, so it's a big deal. There are also lots of safeguards so people just can't buy from abroad; therefore everyone chips in.
Also, Erdogan's primary achievement is the construction business. He made sure the builders could build and profit hugely (destroying a lot of nature in the process) and fine-tuned the economy to serve the construction sector through the years. Guess what the first Covid measure Erdogan announced at the start of the pandemic was? Credit facilities for purchasing homes, no joke.
Also, the new buildings were supposed to be built according to the new, quake-resilient code, and that was one of the prime motivations of this construction-based economy.
Fast forward 20 years and we were struck with another huge earthquake.
The initial response was very weak, in contrast to the 1999 quake: in that one the Turkish Army was summoned immediately, but this time, for some reason, Erdogan didn't request assistance from the army, and the protocol allowing local governments to request assistance had been removed by Erdogan in previous years.
A full day passes with nothing to show as a response while people are sending selfies from under the rubble with their addresses, seeking help. Most places receive no help whatsoever, and people go into a winter night under the rubble or on the streets without electricity, food or water.
Videos show newly built homes collapsed; people find promotional materials claiming quake resistance, yet the new buildings collapsed just like the old ones.
The luxury tax, AKA the earthquake tax, wasn't used to get ready for the next quake. This was widely known and criticised, but now we have videos of newly built buildings collapsed. It has materialised.
Anger builds up. Erdogan is nowhere to be seen, and the ministers are giving ridiculous speeches about how everything is under control, using language crafted for the upcoming elections.
Then Erdogan appears on the TVs giving the speech you can watch.
So yes, he fucked it up for about 20 years and fucked it up on the night of the disaster. He used to be a huge critic of how the politician he replaced handled the 1999 quake, and he ended up doing exactly the same, even worse. If the irony has not set in yet, let me tell you that Turkey has been in an economic crisis for some years now, and Erdogan is in a coalition with the guy who was in a coalition with the guy he replaced. It's almost poetic.
Erdoğan also gained immensely from the economic measures taken after the 1999 earthquake and the following economic depression. He eventually rolled back a significant portion of them, and further wrecked the economy by lowering interest rates in the face of inflation, multiple times: https://www.youtube.com/watch?v=dkyudz4w5Gc
> the earthquake tax collected wasn't used to be ready for the next quake. This was something widely known and criticized
It was openly admitted by a minister that it was spent on building roads. A lot of those roads now have giant cracks in them because of the earthquake.
> ministers are giving ridiculous speeches about how everything is under control
One even said that the rescue operations are going slowly to make sure people under the rubble aren't crushed.
It's ironic that you are spreading disinformation about an app that is used to report disinformation. The app was released about 2 weeks ago. Not 24 hours after the earthquake. The feature was also already within the app.
Yet another example of why inculcating the virtues of free speech and limited government into a culture is so important. It is too easy for those in power to infringe on individual speech rights by labeling something as "disinformation" or "misinformation" as their justification.
Erdogan and his ilk single-handedly made me hesitant about donating: regardless of who I donate to, they'll find a way to line their pockets. I ended up donating anyway, and you should do the same, but Erdogan is a bastard of a man.
And there is what, precisely, to prevent Erdogan and his cronies from just seizing the funds and arresting the principals? These bank accounts are in Turkey.
I would honestly be more comfortable using the likes of PayPal, as it’s less likely to be stolen by the government.
They are very quick to spend the money and don't do long-term projects, they've built a solid reputation over the years, and their founder is a Turkish rock star who is not very politically outspoken.
That’s a good strategy, and good to know, thank you. I worked with an orphanage in Kyrgyzstan that had a policy of ensuring its bank account was empty at the end of every day, and it worked for them.
I’ve made a donation - I hope it ends up helping someone who needs it.
Well, if we go by the logic you stated, a meteorite might fall on my head right now as I leave my workplace.
Anyway, I think there are enough alternatives to donate through. Where I live, a local relief organization is collecting donations for Turkey and Syria, and then there are the UN agencies that do that, etc.
Surprisingly, I upvoted you for teaching me that the earthquake bomb was actually a thing and was used in the wild:
> It was used to disable the V2 launch sites at La Coupole and Blockhaus d'Éperlecques, put out of action the V-3 cannon sites at Fortress of Mimoyecques, sink the battleship Tirpitz and damage the U-boats' protective pens at St. Nazaire, as well as to attack many other targets which had been impossible to damage before. One of the most spectacular attacks was shortly after D-Day, when the Tallboy was used to prevent German tank reinforcements from moving by train. Rather than blow up the tracks – which would have been repaired in a day or so – the bombs were targeted on a tunnel near Saumur which carried the line under a mountain. Twenty-five Lancasters dropped the first Tallboys on the mountain, penetrating straight through the rock, and one of them exploded in the tunnel below. As a result, the entire rail line remained unusable until the end of the war.
Yes, but people will connect this real bomb to something magical that can cause region-wide earthquakes. Undetected, of course, masquerading as a natural event.
Can you explain what makes this account a sock-puppet account, or how they're causing trouble? The comment makes valid points, and provides useful information.
> Twitter was heavily used to seek help by people in turmoil, even hundreds of people were posting from under the rubbles. The officials were claiming that everything was under control and they are helping everyone but people were posting videos showing the situation on the ground and the situation didn't look even close to being under control.
Social media is the first target, no matter the country.
The US does the exact same [1]: they control them with CIA agents [1*], so it's easier for them to do profiling rather than blocking.
In fact, banning foreign apps is the first thing the US wants to do, just like with TikTok [2]; there are recent talks about a global ban too.
So we can't just throw stones at Turkey; you have to examine the situation. They had a terrorist attack in Istanbul in November [3] and Mossad agents doing shady things [4], which btw gives flashbacks of the failed coup by the Mossad [5] (imagine if the coup had succeeded, knowing how the Ukraine-Russia conflict developed and the current issues in Azerbaijan-Armenia; they dodged something sinister).
Twitter is known to be a place with a lot of political activity; it's easy for a foreign country to spread misinformation there, and there is a lot of noise.
And Twitter is not the most popular social media app in Turkey; plus, the population in that region is not very tech savvy either.
You had me nodding until this part. Yes, twitter is filled with misinformation. But government censorship is more dangerous than misinformation. Governments have killed more people than anything else short of heart attacks and cancer. Governments murdered hundreds of millions of people in the 20th century; giving governments the power to censor whatever the government deems to be misinformation is insanely dangerous. The most dangerous kind of misinformation is that which is promulgated by the government itself.
It was one of the missions of the CIA back during the Cold War: spread propaganda, use influence to manipulate the press and journalists all over the world, etc.
Using social media to achieve this mission is the natural evolution, as the way people get their information changes over time.
It's very well documented, so seeing them at key roles isn't a surprise, and shouldn't be a surprise to anyone
They can't do it with TikTok, the app of choice of the youth and people all over the world
Good god - the FBI recently got outed (Twitter files - Elon Musk - hello?) paying more than $30M to Twitter to do their bidding. Are you really this ignorant of current events?
It's pretty clear that these days we all live in different bubbles of what counts as "widely known current events".
Googling, it looks like it was $3 million, not $30 million, and it was for data requested with court orders rather than for suppressing speech or other control? It was still the FBI, not the CIA? This one?
Yes, I legitimately didn't know what the GP I was replying to was suggesting that the CIA controlled, what "them" meant. Now that they replied, I understand what they are suggesting, and think they are... living in a different bubble than me.
The wiki mentions the CIA, but their investigations found links to groups helped by both the Mossad and the CIA. I'll try to find the link for the investigation and edit the post once I find it.
Your source links don't match up with what you're saying. For example, with [1#] you claim the US "controls" social media with CIA agents, but [1#] simply shows a social media company that merely hired former CIA agents. That's like saying Facebook hired a bunch of ex-Microsoft people, therefore Microsoft controls Facebook. Doesn't make sense.
Because when western leaders want to censor information, they claim said information is a "threat to democracy", and everyone just trusts them. (not that it would work - it seems like turkish people actually know how to protest)
Erdogan has, for the past decade, regularly arrested critical journalists and opposition politicians on very flimsy, trumped-up charges.
In America there is a massive and lucrative media industry dedicated to attacking whichever politician or party happens to be in power, alongside half of social media at any given time. It's pretty clear that political freedom in the West is much stronger than in Turkey right now.
> Also there were many incidents of the mainstream media cutting of talks or turning away the camera when people said or did anything discrediting the official narrative.
Twitter itself pioneered this during covid. It's no surprise others learned from it.
Just look at his face when delivering this speech: https://www.youtube.com/watch?v=doy38aKbMw4