Right or left, you should be worried about big tech censorship (eff.org)
659 points by DiabloD3 on July 18, 2021 | 544 comments



Big tech censorship will not be solved without addressing the other half of a problem with two sides.

Without systems that promote quality, non-ideological, nuanced, original, tolerant dialogue, we will keep running into the false dilemma of unmoderated cesspools or heavy-handed moderation.

If we are to avoid this dilemma we need systems that organically promote constructive, tolerant and inclusive communication, and original, nuanced, intelligent, non-ideological ideas, instead of the opposite.

This cannot be done by organizations whose profit incentive is aligned with user attention instead of user satisfaction.

Those organizations benefit from degraded communication until strong moderation is both continually demanded and decried.

---

On a side note, I am so disappointed in the constant framing of ideas as right or left. It dumbs society down incredibly to view so much through the filter of two (often arbitrarily correlated, historically shifting) clusters of supposedly disjoint options.

In the real world, communes sometimes work well; markets often work better at scale, but not for everything; markets require "social solutions" like non-commercial criminal and civil justice systems; education and health could be explicitly tested and then managed as investments with a payoff of lower taxes (including for the rich) instead of as charities forced on the rich; etc.

The Right/Left and associated ideologies are all self-defeating when believed in isolation, with the nuances of their ragged edges, their limitations, and their constructive combinations ignored.


Agreed. I think it's a major issue in political discourse when people begin to identify with labels and slogans, and treat them as prescriptive (ideals to be adhered to) rather than descriptive (approximate reflections of one's current general beliefs).

Surveys consistently show that most of us agree on far more than we disagree when questions are phrased in politically neutral terms. When people actually talk to each other, they quickly realize that most are fairly reasonable and relatable, even if they have good-faith disagreements on particular points. But social media, being optimized for maximum engagement with shallow, short-form discussion, just results in the most extreme elements of society talking over each other and largely failing to come to a common understanding.


I put this to a test over a couple years of business travel.

While bringing up a field service group, I had a period of high travel taking me all over the US. I enjoy politics, culture and people, and am frequently frustrated with the very tribal left-right political discourse myself.

So, what I did was find coffee shops and any other events where people might gather and enjoy conversation, and had a bunch of chats avoiding labels while conveying a strong no-judgement stance: just interested in perspectives from different parts of the nation...

Those looked kind of like a two way interview.

Many of those ended up being long, gratifying conversations too.

Based on these, I would submit the following:

People fear judgement.

People fear "that other tribe"

An awful lot of discourse is fear, blame and shame.

eg: teaching "those other people" a lesson

How could anyone tolerate "that other tribe"

You get the idea.

Those conversations all told me the US has poor class awareness and, with that, an equally poor ability to see others as just peers. Instead it is all colored in various ways that amplify differences, despite the reality that we are all people who run the same basic way and have the same basic needs and struggles.


I associate coffee shops with certain kinds of people. Not universally, of course, but enough to wonder if you were seeing a restricted set of people with whom you were better able to hold those conversations.

You mention other events, so perhaps I'm totally wrong here. But I'm curious about the ways that the events you picked may have shaped the conclusions that you drew.


Totally fair.

No, it was in line at the market, hanging out in the park, the driver on a longer cab or Lyft ride, a janitor, tradespeople...

What I did was just be opportunistic. Tried to fill some of the time I had learning more about how to just talk with people and have them feel they can do the same.

I had biases, mostly targeting normies and people who looked like they may have a story, rough or not.

What I can also tell you is I have done this sort of thing in my past. It is a simple pleasure of business travel.

What I did differently was avoid signaling "my camp", and tried to signal seeking more than anything else. If they were inclined to talk at all, I was frank and said I want perspective outside my familiar circles, so what is your take on... and go from there.

Sometimes I did not offer mine at all, content to let them share what they thought a curious person from somewhere else might want to know.

The primary thing I was interested in was arriving at a comfortable dialog state, like when you know it won't come back to bite, or be on the news, that sort of thing.

And the things I put above inhibit reaching that state.


Was there anything surprising to you? Any viewpoints more common than you previously thought?


Yes!

This one surprised me the most, but people are just tired. Like fried.

Many put a brave face on it, but could use a break. If the chat advanced to where one could really relax and just talk, being tired always came up. And many did not express sadness, just fatigue, sometimes frustration.

"Does it all have to be this hard?"

In my past, I do not recall hearing that expressed as much, or from such diverse socioeconomic positions. This sentiment may be majority-held now. Just a gut take from arguably small and unscientific samples.

"The Reset": the idea of it all not being sustainable. If the conversation went here, people would roll through examples that concerned them. Some would make connections, but the general feeling of impending doom was far more common than I expected. I am sure more people than we know do not believe things will work out.

And one last disturbing one is along the lines of trading liberty for safety. Tons of people could give two shits about the increased surveillance. More people than I expected harbor, and will express, basic fear when comfortable enough to share it.


"Without systems that promote quality, non-ideological, nuanced, original, tolerant dialogue we will keep running into the false dilemma of unmoderated cesspools or heavy-handed moderation." - it is impossible to measure the qualifiers "quality", "nuanced", "tolerant", "original", "non-ideological" while still being objective. Thus, whatever implementation you choose for enforcing these qualifiers, it will inevitably lean towards some ideology. Hence, contradiction.


What you wrote sounds to me a bit like claiming that it's impossible to qualify paintings as beautiful or not, and that therefore there cannot be any art museums.


Well, objectively, you can't have an art museum that contains only beautiful art, since beauty in art is subjective and non-quantifiable. So no, there cannot be an art museum that has only beautiful paintings, and the supposedly beautiful paintings there might be beautiful only to the eye of the curator.

Either way, the point is not whether or not you can have a place for discourse with all the good qualities described; it is more that, because those qualities are not quantifiable, it is impossible to know for sure.


It’s not about the content, it’s about the incentives and reward structures we put in place to govern our society. E.g. we want(ed) a strong economy of businesses in the US, so we subsidize(d) their operation.

We know very well now that social media platforms (sometimes even deliberately) promote content that irks people, because they get rewarded by the ad industry for eyeball time. What better way to consume and waste human capital than to present a bunch of people with outrageous bullshit? There must be a structure or limit we can impose on the attention industry that would deter what we have come to consider abusive treatment of humans. A really dumb but raw implementation would be a use tax, where social media companies are literally taxed for engagement beyond some threshold.

Point is, it’s not a contradiction to demand and invent better systems in the face of poor outcomes. What you can’t do, and what is the contradiction you’re referring to, is force the desired outcomes without addressing the underlying incentives, because then you are oppressing people or companies exhibiting valid behavior within the existing system.
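For what it's worth, the "use tax" idea is easy to sketch. A toy Python version (the threshold and rate are invented numbers, purely for illustration of the shape of the policy):

```python
# Hypothetical sketch of the "use tax" idea: tax attention-hours
# beyond a free threshold. All numbers here are invented.

def engagement_tax(attention_hours: float,
                   free_threshold: float = 1_000_000.0,
                   rate_per_hour: float = 0.01) -> float:
    """Tax only the attention-hours above the free threshold."""
    excess = max(0.0, attention_hours - free_threshold)
    return excess * rate_per_hour

# A platform with 1.5M attention-hours pays tax on the 0.5M excess.
print(engagement_tax(1_500_000))  # 5000.0
```

The key property is that the marginal cost of engagement becomes positive past the threshold, which directly counteracts the "more eyeball time is always better" incentive.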


I disagree.

It is not impossible to measure quality, once you accept that quality has many definitions and therefore requires many contexts, incentivizing different versions of it for different purposes and audiences.

And that diversity would be a very good thing.


Ha, I was going to add basically the same sidenote, even calling it a sidenote and using the word "framing". All I was going to point out was that the title could have been "right, left or center", but instead he chose "left or right". WTF.

We could almost imagine the author himself is vying for attention by pushing the hot buttons known to work. Either that or he is incapable of recognising his own foibles.


I like your comment and think it could potentially approach the broader problem in a roundabout way, but it does feel like a separate thing. But ignoring that, I run a forum and so think a lot about ways of encouraging decent conversation. I think Reddit mostly does pretty well in that (in my experience) the junk is hidden and requires effort to read. Hacker News pushes it down the page, but it's trivial to scroll down and see who's in the naughty corner out of curiosity.

I mostly read r/NBA and find the quality very decent. I'm sure there are trolls and racist rants and so on, but unless I expand comment branches, I don't see any of them. I find that the branches being pruned/hidden helps manage how much time I spend on a discussion too. On HN, I often feel like I want to read or skim every comment in a discussion which is rarely a great use of time.


> Without systems that promote quality, non-ideological, nuanced, original, tolerant dialogue, we will keep running into the false dilemma of unmoderated cesspools or heavy-handed moderation.

I disagree with tying big tech censorship to this.

Yes, what you describe is nice, and it's what the audience here (including me) prefers; that's why we're here. Places with such rules are paradises.

However, I for one accept the risk of freedom over "safety". I acknowledge we humans are "good" and "evil" at the same time; maybe we'll "actualize" at some point in the future as a whole (I doubt it), but we're not there yet. Even if that's a goal, I strongly believe allowing cancel-candidate speech is vital to progress. Producing questionable speech already has its cost; I actually think the cost is already too high, given the mob nature of social media.

Yeah there are "misinformed" people, but isn't that the value of democracy? We disagree but we compromise.

Even in the anti-vaxx case, where the ones who refuse to vaccinate can screw up the whole population, I still don't think you should ban the speech; the law or executive order should be based on action instead.

Even in the Gab case, I don't think the platform should be banned, nor private conversations snooped on. So how do we protect ourselves against such attacks? I'd say these are just extra protections that I don't want, given the tradeoff; the law is there to punish the actions. I'll take the safety risk that's been there since before tech was a thing, any day.

Guess the nuance here is the right to speech versus the right to publish, given that the nature of social media has moved the fourth pillar of democracy to the public. My take is that the same applies even to publishing. If nonsense sounds more attractive than scientific facts, maybe we educated people should work on our PR skills, or fix the root of the problem: education. Now that's progress.


There is more to the problem than a tradeoff between freedom and safety.

The problem isn't created simply by allowing people the freedom to express themselves. It's created by highly manipulative middlemen choosing what to promote and what to suppress (in a relative not absolute sense) via algorithms designed to maximize attention to ads.
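A toy illustration of that mechanism (nothing here is Facebook's actual algorithm; the post fields and weights are invented) showing why an attention-maximizing ranker surfaces provocation:

```python
# Illustrative toy of engagement-maximizing ranking. The fields and
# weights are invented; real systems are far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float
    predicted_comments: float   # flame wars generate lots of these
    predicted_anger: float      # provocation keeps users scrolling

def engagement_score(p: Post) -> float:
    # A ranker tuned for ad exposure rewards whatever keeps eyeballs
    # on the feed, so provocation can outrank calm, informative content.
    return 1.0 * p.predicted_clicks + 3.0 * p.predicted_comments + 2.0 * p.predicted_anger

posts = [
    Post("Nuanced policy explainer", 5.0, 1.0, 0.1),
    Post("Outrage bait about 'the other tribe'", 4.0, 9.0, 8.0),
]
feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0].text)  # the outrage bait ranks first
```

The point of the toy: no individual post is suppressed, yet the ordering systematically amplifies the inflammatory one, which is exactly the "relative not absolute" promotion described above.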

That is nothing like a simple free speech situation.

I have no problem with free speech, and free speech forums, even if they do sometimes become cesspools.

I have a real problem with the most widely used glorified scrapbook-sharing software (Facebook) biasing all our feeds toward content that is likely to pull us into comment flame wars, or using provoked anger and tribal identity reinforcement as user addiction strategies.

Note that Facebook didn't start behaving this way until after the network effect of their social graph had created tremendous lock-in/switching-costs for their users.


Exactly ... We need to be "liberal" in some things and "conservative" for others.


And dispose of those terms completely, except for exceptionally rarefied discussions where attention is paid to very clear definitions of which versions of "liberal" and "conservative" are even being discussed.

Effective policy isn't found using that false Rorschach compass.


Right and Left are basically ideologies catering to different ideas of society depending upon gender. On average, for women a more redistributive, moderating society is more attractive. For men, a society that allows them to display "provider" status seems more attractive. To frame the whole thing as an ideological debate, in a society with massive changes of roles and influence, is intellectually dishonest.

Some transcend these "stereotypes" and favor a more redistributive society, based upon the insight that these transformations require a reform of policy. They are sadly in the minority - in both ideologies.


Bro what the hell are you on about? Have you read anything on Left v Right wing ideologies? What?

Left v Right is about proletariat vs bourgeoisie, and it originates from French politics during the revolution, where the left side of the assembly favored moving power towards the people and the right were Monarchists who wanted King Louis' continued reign.

To equate this to gender is unbelievably ignorant bordering on intellectual dishonesty.


This has to be one of the strangest takes I've come across...


This all sounds very nice, but there's a bit of sleight of hand here, since the article goes out of its way to avoid details. That way everyone reads into it exactly what they want, and it's totally non-obvious whether this is actually a workable proposal or not.

But what's the concrete proposal here? How would things actually work? Will the services be forced to accept whatever unmoderated feed of filth the linked-to service sends their way? If not, how is this supposed to fix the moderation problem?

How are you supposed to link together applications with totally different data, interaction, and identity models? Obviously you couldn't link, say, Discord to Instagram and have the interactions between the systems make any sense. Are we just going to have a predefined ontology of possible social networking apps, populated with the currently existing models, and define an API for all of them? When and how does that ontology get redefined? Or do we define a single lowest common denominator covering everything? What happens with features that don't fit into the models? Are they entirely forbidden? Do they need to go through a multi-year public review bureaucracy?

Who exactly will be forced to interoperate, and who gets a free pass?


> Will the services be forced to accept whatever unmoderated feed of filth the linked-to service sends their way?

I was kinda wondering that. If a user gets banned, and moves off-platform to continue the behaviour there, but is somehow magically still linked to everyone they were linked with before... has anything really changed? Will Facebook be forced to show its users the same nastiness it just booted off, because of the interop necessities?


> Will facebook be forced to show its users the same nastiness it just booted off, because of the interop necessities?

Facebook should be forced to show me the feeds I choose to subscribe to. I'm perfectly capable of deciding for myself whether I find something nasty, and I find the idea of Zuckerberg as an arbiter of morality both absurd and dystopian.


> I find the idea of Zuckerberg as an arbiter of morality both absurd and dystopian

This is a very common characterization that I find frustratingly absurd. Zuckerberg is not "an arbiter of morality" in any respect; the content of Facebook is a corporate business decision, not a reflection of morality.


If FB are blocking things that they deem bad, then they are clearly setting themselves up as arbiters of morality, even if they say they aren't.


Facebook has opinions about the type of content they'd prefer not to facilitate on their platform; that doesn't make them "arbiters of morality", it just makes them curators of their product identity.


So they might say "we don't want $X type of content on our platform, as it would negatively affect our product identity".

But that's in fact a moral judgement! It's saying "some people think $X is bad, and we agree with them (or at least choose to pretend to so as not to soil our image)".


I would disagree that it is a judgement based on morality. It is a judgement made by the logic of corporations, which are machines made up of people that take on a life of their own.

While the actions of organizations like this have moral consequences, and the people that are part of it have their own morals, the machinery itself does not use morality to make decisions, except in hindsight as justification.

The moral consequences may be beneficial or harmful; it is all the same to a profit machine that acts in its own interests, which transcend the people who it is made of in the form of workers and users.


Saying that corporations should act to maximise their profits is in fact also a moral judgement.


I never put a value judgement like should on it. It just seems to me to be the dynamic that happens with organizations. They take on a life of their own, independent of the people they are composed of.


It's not one or the other, they can both care about their product identity, but by the same stroke arbitrate morality by doing so. (Product identity takes popular morality as an input).


By that definition, they're no more an arbiter of morality than anyone else who does this.


What do you mean? Not everyone has the power they do.


The point is that if they're simply responding to consumer demand, then it's collectively us who are the censors.


If we were the censor, we would have the government do it (which represents us). Facebook's opinion of what it perceives to be morally dubious does not represent anyone but Facebook itself.


> If we were the censor, we would have the government do it (which represents us).

Well, sort of. We can't legally, right? The constitution prevents that. If Facebook is simply censoring the things that consumers ask it to, it is absolutely serving itself (in a profit-driven way), but it isn't asserting Facebook's morals. It's asserting something more like "American consumer morals".


Facebook's clients are not individuals; they're ad-centric. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

Plus, the reason we can't legally is because we agree it shouldn't be done.


> Facebook's clients are not individuals; they're ad-centric. Facebook is serving what it believes to be advertisers' interests, which may not align with most people's.

But this is still ultimately consumer morals, since it's about content that advertisers don't want to be associated with, because that content will reflect badly on them in the eyes of the consumer.

> Plus, the reason we can't legally is because we agree it shouldn't be done.

We (generally) agree that the government shouldn't engage in censorship, yes. The claim that individuals and groups should not be able to themselves moderate the stuff that appears on their websites is a much more controversial claim.


> the content of Facebook is a corporate business decision, not a reflection of morality.

Are you suggesting business decisions are exempt from having moral consequences? Are the decision makers somehow less culpable as long as it is "just business"?


Of course business decisions have moral consequences. Does this imply that all business leaders are arbiters of morality, or is it only Zuckerberg--who is just making business decisions like the rest--who is an arbiter of morality?

The answer is, he's not. He's just a greedy person with a powerful company. There are no real moral decisions to be found here, hyperbole notwithstanding.


> Does this imply that all business leaders are arbiters of morality, or is it only Zuckerberg--who is just making business decisions like the rest--who is an arbiter of morality?

Zuckerberg and other social media CEOs are the only ones using moral language to justify censorship, hence the irony of Zuckerberg as moral arbiter


If yellow journalism can stampede a nation into war (e.g. "Remember the Maine") through presenting a slanted view of the situation, well, Facebook and however it chooses to editorialize what it allows to be presented to its users has a vastly bigger audience than the newspapers of old. There's nothing "absurd" about that.


In future, as social media gets more powerful, they will be able to determine who gets elected. No-one will be able to get elected if they go against them.

Once that happens, Google and Facebook will never have to pay their share of taxes ever again!


I think this gets you into really weird places really quickly.

How do you not end up with a censorship department of the government, that is tasked with writing moderation rules? It probably wouldn't be called a censorship office, but that's what it would be: a government office determining what can and can't be posted on social media. This seems entirely antithetical to the idea of the first amendment.

Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

EDIT: Just saw in your profile that you might not be from the USA - feel free to substitute "the idea of free speech in modern liberal democracies" for "the idea of the first amendment" as needed. :)


> How do you not end up with a censorship department of the government, that is tasked with writing moderation rules?

That already exists: governments already have laws limiting some forms of speech, e.g. copyright infringement or pornography. Obviously, corporations that want to stay in business have to obey the laws.

> "the idea of free speech in modern liberal democracies"

As I've already pointed out, all states already have laws limiting speech. Maybe I think there should be more such laws; maybe I think there should be fewer; but either way, it's orthogonal to whether I want FB to have the power to control what I can say and hear.

If my government passes laws that I don't like, citizens can vote them out at the next election. I can't vote Zuckerberg out, he's an undemocratic locus of power.

> It probably wouldn't be called a censorship office, but that's what it would be: a government office determining what can and can't be posted on social media.

As I said, such laws already exist.

> Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

Yes, require all social media apps to provide interoperability through ActivityPub and similar protocols. Make sure there are no legal or technical impediments to people running social media servers (Mastodon etc.) on their own hardware through their home internet connections.
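For a sense of what that interop looks like on the wire: in ActivityPub, activities are small JSON objects exchanged between servers' inboxes and outboxes. A rough Python sketch of a cross-server Follow (the server URLs are made up for illustration):

```python
# A minimal sketch of cross-server interop under ActivityPub: a Follow
# is a small JSON-LD object POSTed to the target actor's inbox.
# The actor/object URLs here are invented for illustration.
import json

follow_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Follow",
    "actor": "https://home.example/users/alice",   # Alice's home server
    "object": "https://other.example/users/bob",   # Bob, on another server
}

# Any compliant server can parse and act on this, regardless of vendor.
payload = json.dumps(follow_activity)
print(json.loads(payload)["type"])  # Follow
```

Because the format is vendor-neutral, a user banned from one server can still be followed from another, which is exactly the interop question raised upthread.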

What I actually want is a world where people don't care whether FB censors things, because it doesn't matter as people can and do use other means to communicate. So FB censors stuff and no-one notices because it has no practical effect. That is, a world where FB is stripped of their power.


> What I actually want is a world where people don't care whether FB censors things, because it doesn't matter as people can and do use other means to communicate.

Fortunately, we already live in this world! We have the internet!


Lol, that internet has its public square embodied in a few big systems.

If it were me, I would have those regulated like utilities. Pre-internet, business was done via phone. Deplatforming = not being allowed to have a phone, and that basically did not happen.


Deplatforming doesn't apply to phones because phone communication is fundamentally different from social media.

And prior to the internet, deplatforming still existed: when would you ever hear anyone make neo-nazi arguments on any of the broadcast tv networks?


Well, broadcast has been gatekeeping on a LOT of fronts for a very long time, nothing new there.

When do we see economic reporting from the labor point of view, for example? We don't on broadcast anymore. And after massive media consolidation, that reporting only happens via indie media, and it's largely via the Internet, discoverable through social media.

The answer to broadcast issues was public access broadcasting, and the likes of PBS. That sort of worked, except for underfunding and the need for big business to underwrite most of the programming.

In terms of social media, what is needed is process. Trying to do business today without it is similar to not having a phone. However, let's set that aside and just look at conflict of interest and speech issues.

The major players are the public square today. Whether that was intended isn't the point. That they are, is.

Back when Alex Jones was deplatformed, a lot of us, myself included, said we were headed down this increasingly difficult road today, and here we are. We were also frequently judged as being some supporter for Jones. (I am not, but see his case as being high value for this discussion)

Process needs to be there to manage expectations, provide opportunity for improvement, provide consistency and all the basics needed. In other words, some growing up needs to happen. Costs are higher, risks are higher, and the money is more than good enough.

Without it, what we've got is a very highly arbitrary, quite easily abused system that far too many people depend on for important discussions. Look at the folks advocating for direct democracy. They assume all this is here, working properly, and working for us, not against us.

The neo-nazis were on public access BTW. I seen 'em, along with quite a few others on the mandated access channels.


> How do you not end up with a censorship department of the government

That department of censorship will exist no matter what. Internally, it is called the trust and safety department, at most big companies.

The only question is: do you want the decisions of that department to be limited by judges and rights that are laid out in the constitution, or do you want it to be an unaccountable free-for-all, where they can do whatever they want?

> Do you have any other way to implement your idea of forcing Facebook to show you the feeds you subscribe to?

Well, phone companies are currently forced to allow most people on their network. We could do it the same way that phone companies are required to.


> The only question is do you want… or do you want…

Therein lies the problem. Users have no choice at all over which moderation policies their feeds are curated by. I should be able to choose moderation that I deem acceptable. To continue your analogy to telcos, this is akin to robocall-blocking services, which are a user decision and often tell you when blocking occurs.


> rights that are laid out in the constitution

Just to be clear, under US law, what you are describing (creating a government censorship department that determines how social media companies are allowed to moderate) would have NOTHING to do with the rights laid out in the constitution - it would in fact need to overcome the companies' constitutional rights to publish whatever they wish, just to exist.

You can argue that it's a good idea anyway, but claiming that government-mandated moderation is somehow enforcing constitutional rights is way, way off base.


> Just to be clear, under US law, what you are describing (creating a government censorship department that determines how social media companies are allowed to moderate

Yes it could.

The point is that if the government is doing the "censoring", or whatever you want to call it, then at the very least we have judges and rights and similar checks and balances that prevent the government from doing too much.

If something is so bad, that it needs to be censored, then our court system should handle it.


The legislatures are already the "censors", and the executive the ones enforcing "censorship". And there is an "appeal" process, to some degree, in the judiciary branch. The legislatures already write the "moderation rules" for entire societies.

Unlike the big companies, who are beholden only to their shareholders - and sometimes really only to that one shareholder called Mark - the legislatures are beholden, at least in theory, to the citizens (who vote) in our democracies, and are curtailed, at least in theory, by things like the US Constitution and Bill of Rights, and equivalent constitutional documents and human rights legislation in other places.

Is leaving it to legislatures to make the rules a perfect solution? Certainly far from perfect, in my humble opinion, at least in practice with all the lobbying, corruption and self-interest going on. But it still beats leaving such extremely significant decisions of what can and cannot be said to the rulers of some big corporations.

Facebook, Instagram, Twitter, YouTube, etc. will also never tire of telling everybody that they are not publishers and do not "edit" or "curate" information on their "open" platforms, and thus fall under Section 230 and are not liable. All the while they constantly curate and edit and purge.


Having the government writing censorship rules sounds absolutely dystopian to me.

There are plenty of ways to get information on the internet - just because Facebook says "no porn" doesn't mean you can't get porn on the internet, you know?

But there's only one government (per country, of course). If your government says no porn... What then?


> Having the government writing censorship rules sounds absolutely dystopian to me.

No, because our court system could prevent any censorship that infringed on our rights.

It is better to have our court system stop people from censoring too much, than it is for there to be zero rules at all, as to what could or could not be censored.


> No, because our court system could prevent any censorship that infringed on our rights

No, they can't; the legislature would. Do you trust Nancy Pelosi and Ted Cruz to write censorship rules you agree with?


> No they can't. The legislature would

Yes, the courts would. Lol. Any laws that infringe on our rights can be struck down by the court system.

That's literally the point of the court system: to strike down laws that infringe on our rights.

> Do you trust Nancy pelosi and Ted Cruz to write censorship rules you agree with?

Hey, at least if they write them, the courts can strike them down, if they infringe on our rights.

At least there is some check and balance there, where our court system, will strike down laws that infringe on our rights.

I'd rather have that than no court system being able to strike any of it down.

If you want to suggest that the court system should be able to strike down FB censorship rules, that would work as well though!


You're simultaneously advocating for a sweeping change in how the courts interpret our rights, while proclaiming that they will continue to protect them.

Currently, the courts have an opinion on censorship that is extreme enough that it protects what Facebook is doing. Today, the government cannot write censorship rules. They all, to use your words, "infringe on our rights". So if you want the government to be able to do that, and the courts to allow some censorship, you have to amend the constitution, and change the recognized legal framework of what our rights even are.

That (broadly) falls to lawmakers, not the courts. Do you trust Pelosi and Cruz (and other lawmakers) to change the very framework of what our right to free speech is, while simultaneously allowing the courts to protect it?


> while proclaiming that they will continue to protect them.

Well, our rights aren't particularly protected from the censorship that happens right now.

There are zero checks and balances on the censorship that already happens. At the very least it would be better if we had the courts more involved, to make sure that those courts have some say over what censorship can happen or not.

> Today, the government cannot write censorship rules

Sure they can. They are already doing so. Go look at the laws that cover phone companies. The government already has laws that prevent phone companies from engaging in certain acts of censorship, or whatever we want to call it, on the telephone network.

Those laws don't apply to other things at the moment. But they could, if we changed them, and it would be no bigger an issue than how those existing laws already apply to phone companies.

> you have to amend the constitution

No we won't. All we have to do is take our existing common carrier laws, and expand them.

Common carrier laws, as they apply to phone companies, are already constitutional.

> to change the very framework

There is no massive change to the framework of free speech when we already have similar laws that apply to a similar thing: the phone network.

Whatever is happening there already, and which already does not infringe upon our rights, could be expanded to apply to other things, and no massive constitutional change would be needed to apply these existing laws to new things.


> Well, our rights aren't particularly protected from the censorship that happens right now.

Given that I don't believe that it is fundamentally possible for any private entity to infringe on my right to free speech, I'm going to have to disagree with you here.

> No we won't. All we have to do is take our existing common carrier laws, and expand them.

The Cato Institute appears to disagree with you that it's that simple[0]. It's not clear that applying common carrier status to Facebook or Google would be constitutional, because they do not and have never claimed to be neutral, so using the force of law to make them so would be an infringement on the companies' rights to speech and association.

[0]: https://www.cato.org/blog/are-social-media-companies-common-...


> they do not and have never claimed to be neutral, so using the force of law to make that so would be an infringement on the companies' rights to speech and association.

You are not being creative enough, with how we could apply these laws.

One big reason phone companies claim to be neutral in the first place is to get certain protections that those laws give them.

They aren't neutral simply because they want to be. They do it on purpose, because the law strongly incentivizes them to.

A similar strategy could be used with Facebook. What can be done is take away certain protections that we currently give them, and only give them back in exchange for them following something similar to common carrier rules.

The most common protection that people bring up as the one to take away is Section 230.

When smart people talk about repealing Section 230 protections, the goal is not actually to remove those protections.

Instead, the strategy is to make it so that if companies want to keep those protections, they have to follow other requirements, by making those protections contingent on acting a certain way, such as following common carrier rules.

That is the way to get around being unable to force Facebook to follow common carrier laws. You don't actually force them to do it. Instead, you make the alternative extremely difficult, by making previously necessary protections contingent on following new neutrality laws.


This doesn't make sense. If you repeal 230, Facebook doesn't gain anything from being a common carrier. Repealing 230 already achieves what you seem to be after, which is that the companies can no longer do any form of moderation.

The problem there, of course, is that both the average citizen and the average lawmaker want the companies to be able to do some forms of moderation. Which brings us back to the original question: Do you trust Pelosi and Cruz to define a reasonable framework for internet content moderation?


> If you repeal 230, Facebook doesn't gain anything from being a common carrier

Yes they do. They would not be able to be sued if they were a common carrier, and in return they would be unable to engage in many forms of censorship.

If they did not have Section 230 protections, then existing in their current form would be almost impossible, and therefore they would have to choose the alternative protections, which are common carrier protections.

Common carrier protections are different from Section 230 protections. If they are put in a position where their only real option is to come under common carrier laws, then that is the point.

> the average lawmaker want the companies to be able to do some forms of moderation

Common carriers are already allowed to do some forms of moderation/censorship. It is simply much much more limited.

And additional forms of moderation could simply be handed over to the user, to choose what they want.

For example, it would be completely allowed for a customer to have some automated calling blocklist, where they choose to block all adult-content phone lines that call them.

> Which brings us back to the original question

No it does not bring us back to that, because common carriers are already allowed to do very limited forms of moderation, and that is what it would be good to incentivize Facebook into following.

We don't have to have lawmakers do anything except get us into a position where Facebook is strongly incentivized to choose to operate under common carrier protections, where they will then be restricted to the extremely limited types of censorship that common carriers are allowed.


> Yes they do. They would not be able to be sued, if they were a common carrier, and they would be unable to engage in many forms of censorship.

If facebook doesn't moderate, they can't be sued anyway. Without section 230, moderation makes them liable, but if they choose not to moderate, they aren't liable.

You're suggesting that by opting into a common carrier status, they gain what?

> Common carriers are already allowed to do some forms of moderation/censorship. It is simply much more limited.

Such as? As far as I can tell, they're allowed to moderate forms of pornographic and harassing content which, to be clear, a non-common carrier would be legally required to remove anyway. The statutes you're referencing basically say that a common carrier still has to do the moderation that everyone else is legally required to do.

> No it does not bring us back to that. Because common carriers are already allowed to do very limited forms of moderation, and that is what it would be a good thing to incentivize Facebook into having to follow.

Except, you're

1. Mistaken about the status of the moderation common carriers can do, as I explained just above and

2. Mistaken about the amount of moderation that people want. Both lawmakers and average citizens want more moderation than a common carrier is allowed to engage in, which brings us once more back to my question: do you trust Cruz and Pelosi to develop a moderation framework?

I already know the answer. You don't. What you appear to want is a form of no-moderation absolutism that forces these companies to promote all speech that isn't harassment or pornography.

The problem with that is that, from a profit perspective, most users of those systems (and transitively, most advertisers) don't want to be associated with that kind of content.

The practical result would be a mass exodus from these platforms to other, different, new forms of social media that avoid being classified as common carriers (or do things to avoid the broad section 230 issues, perhaps by being even more highly moderated and small-group oriented or totally private, like discord). Maybe that's a win for some people, but if your goal is popular unmoderated platforms, that's not what you'll get, because that's not what consumers want.


> You're suggesting that by opting into a common carrier status, they gain what?

It means that they can't be sued for content on their platform. If they do not have Section 230 protections, they would be responsible for every piece of content on the platform.

And if they then choose to be a common carrier, they also get that immunity from being sued. There are two different ways of receiving this immunity from lawsuits, and if you take away way one, which is Section 230, then their only option is to take way two, which is being a common carrier.

> but if they choose not to moderate, they aren't liable.

The point would be to strongly incentivize them to respect the user's choice of moderation. Common carrier laws are the way that people often talk about doing this, but sure, maybe there is a third way that you have found.

> Such as?

Such as moderation that is done at the request of the user, for that specific user only. The phone company can block calls if that user wants it, and common carrier laws don't stop that.

If Facebook were only allowed to do moderation at the request of that specific, individual user (or, of course, of things that are already illegal), then that solves most of these problems where users want moderation for themselves.

If we want to get more detailed here, we could imagine allowing categories of moderation that Facebook can engage in if and only if the user opts into them. Then we don't have to define moderation, and it can simply be whatever that specific user wants.
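To make that concrete, here's a minimal sketch of opt-in-only moderation. The category names, post fields, and the `visible_feed` helper are all invented for illustration; the point is just that a post is only hidden from a user if that user enabled the category:

```python
# Hypothetical sketch: the platform defines filter categories, but a
# post is hidden from a user only if that user opted into the category.
# Category names and the post format are made up for this example.

FILTERS = {
    "adult": lambda post: post.get("adult", False),
    "politics": lambda post: post.get("topic") == "politics",
}

def visible_feed(posts, user_opt_ins):
    """Return the posts this user sees, applying only the filters
    that this specific user voluntarily enabled."""
    active = [FILTERS[name] for name in user_opt_ins if name in FILTERS]
    return [p for p in posts if not any(f(p) for f in active)]

posts = [
    {"id": 1, "topic": "sports"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "cooking", "adult": True},
]

# A user with no opt-ins sees everything; opting in hides only
# what that user asked to hide.
print(len(visible_feed(posts, set())))          # 3
print(len(visible_feed(posts, {"politics"})))   # 2
```

Under this model the platform never has to decide a global moderation policy; it only has to honor each user's own choices.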

> the average citizen want more moderation

That can be done by incentivizing Facebook to allow user choice of their own moderation. Problem solved. Then they can get that moderation, and we don't have Facebook forcing this moderation on others.

> What you appear to want is a form of no-moderation absolutism

Nope. I want the moderation to be left in the hands of the specific user. Therefore, everyone gets what they want, except Facebook I guess.

> most users of those systems

And yet, the phone system works perfectly fine. I don't particularly care if the phone companies are mad about being forced to follow certain laws. We wouldn't have to make laws, if they wanted to follow them.


> It means that they can't be sued for content on their platform. If they do not have section 230 protections, every piece of content on the platform, they would be responsible for.

No! See Cubby, Inc. v. CompuServe [0], which predates Section 230. If you do zero moderation, you are not liable for user generated content period. Section 230 creates a good-faith protection so that if you do choose to moderate some content, you don't become liable for failing to moderate the rest.

The legislative history of Section 230 is literally that without it, companies were liable for user-generated content only if they tried to moderate, and lawmakers wished to encourage companies to moderate content.

> Such as moderation that is done at the request of the user, for that specific user only. The phone company, can blocks calls if that user wants it, and common carrier laws dont stop that.

This is not "moderation" in the common use. In the context of facebook, this would be that "I'm unable to block another user", which is silly (and perhaps should reveal to you why trying to shoehorn Facebook into a common carrier doesn't make sense).

> That can be done through incentivizing facebook to allow user choice, of their own moderation. Problem solved. Then they can get that moderation, and we don't have facebook forcing this moderation on others.

Facebook already allows users to block other users and pages. All of the options you want are already available. You're suggesting a strict reduction in moderation tools, and claiming that this will solve everything the way everyone wants.

So let me pose a simple question: how does your approach address coordinated misinformation campaigns? Say I come along tomorrow and start spreading lies about how and when voting works. I'm so effective at this that turnout next year is significantly lower because people are confused. I haven't done anything illegal. Can Facebook remove my content? It's actively and demonstrably harming the democratic process by preventing people from voting. What should we do?

> And yet, the phone system works perfectly fine.

I basically disagree here. Spam is a much larger problem on my phone than anywhere else. Plus, due to the implicit attributes of the phone system, things like spam and misinformation are handled by being crimes on the part of the caller. In other words, Ben Shapiro can post whatever he wants on Facebook, but if he were to call every American citizen and try to share the same information, that would be a crime. Not because of the information, but because sharing things widely via the phone system is essentially illegal.

[0]: https://en.wikipedia.org/wiki/Cubby,_Inc._v._CompuServe_Inc.


> . If you do zero moderation, you are not liable for user generated content period

Ok sure, whatever. But the point would be to put Facebook in a position where they are only doing moderation that a user is requesting. We can use common carrier laws, or we can use some different legal mechanism. That's the goal. The specific method, or process, isn't really the main point.

> you're suggesting a strict reduction in moderation tools, and claiming that this will solve everything the way everyone wants.

I am suggesting that users should be allowed to make voluntary choices about what moderation they want to have, as it relates to them, without Facebook forcing those decisions on the user.

> All of the options you want are already available

No, users are still forced to accept Facebook's chosen moderation preferences. They can't choose to opt out of Facebook's censorship or moderation if they voluntarily choose to subscribe to certain content.

> solve everything the way everyone wants.

It will solve what a lot of people want. Sure, I am sure there are people who are just explicitly pro-censorship, who want the 1st Amendment removed, and who want the government to ban all opposing political opinions.

But there are also a lot of people who don't want facebook to be forcing users to accept certain moderation, and instead want users to be in control of what they see.

> how does your approach address coordinated misinformation campaigns?

If something is so bad, that it needs to be censored, then that is what the government is for. Terrorist threats are already illegal, for example.

Backdoor censorship isn't the way to go. If you really believe that mis-information is so bad, that it needs to be stopped, by these types of methods, then the onus should be on you to gather enough support, to get around the protections of the 1st amendment.

Checks and balances are difficult to get around for a reason. And if you can't get around these checks and balances, well thats the point of those checks and balances.

> I haven't done anything illegal

Well, that's life. You are trying to backdoor your way into getting stuff censored, because you aren't able to convince the government to do it. We don't need this backdoor method of supporting censorship.

> Spam is a much larger problem on my phone than anywhere else

There is nothing illegal about phone companies offering additional, voluntary methods of allowing users to choose more things to be moderated for themselves.

Such problems could be solved that way: by Facebook, or whoever, offering voluntary programs to block certain types or categories of things, if the user chooses to opt in to that form of moderation.

> Ben Shapiro can post whatever he wants on Facebook

You don't have to listen to Ben Shapiro if you don't want to, in this situation. Instead, you could just voluntarily choose to subscribe to a "ban right wingers" blocklist, or whatever voluntary algorithm the user opts in to.


> to get around the protections of the 1st amendment.

I have gotten around the protections of the first amendment though! You're objecting to how I did it, saying that it's not fair or not right, and that I should instead get around them in a different way. But why am I supposed to trust that you won't also object to that other way when, or if, I do gain that support?

> But there are also a lot of people who don't want facebook to be forcing users to accept certain moderation, and instead want users to be in control of what they see.

They are: they're free to use a different service. No one is forcing you to use Facebook or to distribute your content via Facebook. There are plenty of other ways to distribute your content, and you have no more right to require Facebook to distribute your content than they have a right to force you to use their website.


> I have gotten around the protections of the first amendment though!

Yeah, that's the point, lol.

If you are just pro-censorship, and don't care about the principles that the 1st Amendment protects, and just want to find a way around it, just say so.

If you would support the government censoring all opposing political opinions, or jailing people, or worse, because of some legal technicality, just say so.

As long as you are honest about it: if you just want to do things that are basically equivalent to the government censoring anything for any reason, just say so, and say that you want to ignore all of this stuff and find a way to censor all of your political opponents in any way that you can.

Yes, I completely agree that people such as yourself don't care about any of these principles, and just want to figure out whatever ways you can to de facto censor anyone for any reason, regardless of the motivations for stopping censorship in the first place. That's the point.

The whole point is that you don't care about any of this, and just want to engage in authoritarianism in any way that you can get away with. That's the problem.

That's what everyone else is trying to fight against: authoritarians who look for loopholes to engage in authoritarianism in whatever way they can get away with.

That's why we want laws to stop that!

> They are: they're free to use a different service.

But the alternative solution, which provides a lot of benefits to most people, is to simply incentivize Facebook to implement these policies.

That would allow people to get most of what they want, by preventing these workarounds where the principles of anti-censorship are ignored through the backdoor, while also simultaneously giving users the ability to moderate the content that they see.

> have no more right to require facebook distribute

I agree that the law does not currently prevent Facebook from doing certain things. Which is why the conversation, the entire time, was about making new laws, or changing laws, so that they cannot engage in what is, in your own words, a way to get around the 1st Amendment and engage in mass censorship that is almost as bad as if the government did it.

That's the problem!


>Today, the government cannot write censorship rules.

That actually was my point: governments do write censorship rules already, and define the punishment you get for violating them.

Child porn is censored, for example. Here in Germany there are censorship laws against promoting National Socialism or antisemitism.

Our democracies are structured in a way that legally prevents legislatures from writing censorship rules as they please and without good reasoning. That is usually achieved by constitutional law guaranteeing freedom of speech to the widest extent possible, along with freedom of the press, of assembly and of religion, and by a division of powers with a mandate for those divisions to check each other. The courts, but also multi-party democratic systems, the press, voters, activists and protestors, play a significant role in this.

While this system is not perfect, and there were and will be mistakes and abuses along the way that need to be corrected - which usually takes a lot more time than anybody likes - it is still the best we ever had and will ever get I think. The only real "alternative" we tried, that of a "strong" leader or oligarchic group of "strong" leaders was and is always a lot worse.

But now we moved a very significant part, if not a major part, of public discussion forums onto the platforms of private companies, and are now surprised that the "strong" leaders on top of these companies do not necessarily share our values or even the minimal set of values our societies as a whole agreed on, and make up and change rules as they please, without any real repercussions so far.

And not just that: they sometimes actively try to use their immense powers of content distribution and "moderation" to nudge discussion one way or another. We already see these companies getting bolder in restricting certain topics all by themselves, be it around certain Corona-related topics, be it to "fight white supremacy" ideology or other "extremism". And of course the companies started by censoring topics where reasonable people would most likely say "oh, those are dangerous or vile ideas by dangerous or vile people" [and then somebody brings up the Paradox of Tolerance to justify giving the companies free rein in those areas]. I am pretty much convinced that a lot of these are test balloons by the companies to see how far they can push the envelope right now before the momentary backlash becomes too much. And if we let them, then over time they will slowly push the envelope further and further for more and more control.

We and our politicians even help those companies gain more power and control, by writing laws delegating policing responsibilities to the companies and by demanding that the companies police more and more against misinformation, extremism, racism and other such -isms. Not only is that a transfer of power, it's at the same time regulatory capture, making sure any future competitors to these companies find it almost impossible to enter the market at all.

The first step on the route to doom is signing a petition requiring people to "provide ID before making a social media account" (as is happening right now, e.g. in the UK, because some racists said some vile things about English football players), and then before you know it you end up with a "social score" system like in China, except it's the big companies maintaining and regulating it, and not even the (repressive) government.

Good intentions and all that...

[/rant over]


> But now we moved a very significant part, if not a major part, of public discussion forums onto the platforms of private companies, and are now surprised that the "strong" leaders on top of these companies do not necessarily share our values or even the minimal set of values our societies as a whole agreed on, and make up and change rules as they please, without any real repercussions so far.

But the alternative is that the government forces its own values onto the companies. I'm suspicious of "we're going to hide this information because its bad", but I'm also suspicious of "we're going to prevent companies from expressing values different from the ones the government allows".

I much prefer independent systems to a government that ultimately controls how and what corporations can say. The first is perhaps bad, the second is far worse.


You seem to want some sort of aggregator ... ? I'm not sure facebook's the best model to start with there. I think you'd be happier with a third-party service that could receive from FB, and then we wouldn't get into forcing facebook to carry stuff.


I don't want to force FB to do anything. I want to require all social media companies to allow aggregation (through ActivityPub etc.), but if Zuckerberg/Facebook decide they don't want to be in that industry, I would have no problem with them simply closing down.


>Facebook should be forced to show me the feeds I choose to subscribe to.

What about constantly manipulating you to become aligned to the more extreme versions of your ideals, so you will subscribe to more extreme feeds?


>Facebook should be forced

I would consider this sentiment more absurd and dystopian than Facebook freely removing content from its own website.


Why? Facebook is not a person, it should exist solely at our discretion. "People should not be afraid of their governments, governments should be afraid of their people" applies even more strongly to corporations than to governments.


Corporations are just citizens cooperating in a venture. That shouldn't result in them losing rights simply because they are incorporated.

The law should be applied to relevant details of a situation, not something arbitrary like incorporation unless the issue is specific to the details of incorporation.


> Corporations are just citizens cooperating in a venture.

No, that's partnerships or unincorporated associations (well, "people", not necessarily "citizens"). Corporations are distinct legal entities created by the power of the state and imbued with state-issued charters, whose investors are granted the immense legal privilege of limited liability.

> That shouldn't result in them losing rights simply because they are incorporated.

The exercise of state power involved in incorporation, and the privileges at public expense it grants, should absolutely be tied to restrictions and conditions: ones that ensure a sufficient public benefit is served to warrant the cost, and that the exercise of state power involved in the creation, existence, and operation of the corporation is consistent with the restrictions otherwise applicable to state power.


> Corporations are just citizens cooperating in a venture.

No they're not. They come with an extraordinary privilege of limited liability, which is meant to be paired with a corresponding responsibility to create a social benefit.


You are right.

On a different tack, if governments legislate some kinds of moderation for corporations, then the corporations would become an extension of the government and that would become government censorship in practice. Not good.

If governments legislate no moderation for corporations, then corporations would lose the ability to encourage different kinds of communication with moderation.

I don't see either case as healthy for freedom, capitalism, or individuals in the end.

Legislating that moderation rules must be made public, and that moderation actions be logged so the public can review a corporation's compliance with its own chosen rules, seems reasonable to me. It would prevent deceptive moderation, which to me is a problem similar to other kinds of fraud that are outlawed.


All governments apply rules to corporations forcing them to do or not do things. Are you in principle against that?


Not at all. I am against arbitrarily forcing them to do or not do things, which is what some people are calling for here.

I think Facebook is a terrible site, and I don't even know why anyone would want to keep using it tbh.

edit: typo


Do you consider the already-existing (in other fields) common carrier arrangement to be absurd and dystopian?


[flagged]


There’s a huge gap between ‘legal but morally unacceptable to some’ and ‘blatantly illegal’. Making an argument against the latter while ignoring that the discussion is largely about the former is the very definition of a straw man.


Characterising people who want free speech as the criminals who will use that speech for the worst ends is no different to people who opposed a fair trial for all because criminals would want a chance to avoid jail.


Will somebody please think of the children?


>But what's the concrete proposal here? How would things actually work? Will the services be forced to accept whatever unmoderated feed of filth the linked to service sends their way? If not, how is this supposed to fix the moderation problem?

Does gmail block emails from your friends containing misinformation?

Why should facebook be allowed to do the same thing?


Gmail does all kinds of filtering on the incoming messages, such as detecting spam, malware and phishing. So it certainly isn't the case that they're just delivering every email to your inbox. Do they block misinfo? I genuinely have no idea.

But if it's not happening, it seems pretty obviously like an explicit product decision rather than some kind of regulatory requirement.


I think it's OK to say (1) you are allowed to send pornographic magazines through the mail from your place of business, but (2) you are not allowed to put a pornographic image on the front of your building.

I also think it's fair to carry this analogy over to the internet, and the difference between an email and a social media post.


A social media post is read only by the people who subscribed to the author, and those who followed the links to it. Not quite the same as a public building that random people walk past, and have to see regardless of whether they want to or not.


And that would be fine, social media companies don't have to promote "objectionable" content in curated public feeds, but people who deliberately subscribed to or are looking for that content should still be able to see it (like Trump's feed). That's the step people are objecting to, so the analogy you draw just isn't relevant.


> Does gmail block emails from your friends containing misinformation?

No, but they actively block emails from many legit servers in order to enforce an email oligopoly. Note I'm not talking about servers who can't DKIM sign or are on well-known blocklists.


Google is apparently going to be doing that with content on Drive. Not sure what they've said or not said about Gmail, but what's the difference, really?


The difference is that gmail isn't publicly serving your emails to an open audience. The restrictions on Drive are only for publicly-served content.


That isn't true, they already block you from downloading content from private shared folders that they decide to restrict.


So they say. For now.


I dunno why this is downmodded, given that e.g. OneDrive already applies its rules to all content, not just shared stuff; and Facebook is actively censoring even private chats (it will refuse to send some links, e.g. to certain websites about cannabis).


Don't they? If someone sends me spam/malware I expect gmail to filter that, even if it is from a friend's (hijacked) account.

I think there definitely is a very blurry line between spam/fraud and spam/misinformation.


There's no blurry line - both are spam if the recipient didn't ask for them. But at the same time, misinformation is not necessarily spam, and the ongoing brouhaha is largely about the kinds that people voluntarily choose to consume (and come back to ask for more).


A good question, but much of it does get answered in the linked articles discussing the proposal.

Large social networks already have APIs for partners, but who may use them, and how, is defined through legal agreements and access controls. The ACCESS Act is intended to regulate how those legal agreements over the API are defined.

The EFF wants to allow third parties to ask those users for consent to use those APIs for interoperability. In theory this would not put any additional requirement on the social network, and the consent request could occur out-of-band. The only change would be the legal ramifications and access control.

In theory there could be a dominant social network without any API for third parties, in which case this proposal might be defeated. In practice that is very unlikely, given the needs of large companies. The proposal could also require, if needed, that internal APIs, if they exist (which they all have as a matter of practicality), get opened up in the absence of any external API that can be used to enable interoperability. I would however guess that such a broad brush would be problematic, and a bit unnecessary, since again every large social network already has third-party APIs.

Does Discord have an API that a selected group of partners has access to? Does Instagram? Given the correct permissions, interactions between those two networks would be a matter of a third party using those APIs.


> In theory this would not put any additional requirement on the social network

This falls apart as soon as you examine it though. If the law only required a platform to give competitors access to whatever API they currently have without actually detailing what needs to be accessible then it would be trivially easy for platforms to change/restrict those APIs in ways that make them useless. And as soon as you try to address this you're back at the original problem of how you create a standard model that works for everyone.


Why does a town square need moderation?

You can't just ignore the fact that social networking has become the new town square because it's inconvenient


The town square analogy would be fine if there was a ‘net neutrality’ for social media: they have to show me everything I follow in chronological order with no filtering or recommendations. As it stands it works more like the letters page in a newspaper, where an editor is deciding what gets included and what doesn’t.


>You can't just ignore the fact that social networking has become the new town square because it's inconvenient

That's not a good analogy. Facebook is mostly used as a semi-private space (akin to an open house), or as just private (a private party with friends). No one moderates that; if you don't want to listen, you don't join the party (i.e., don't follow the person/page).

In a town square, if you scream profanities, harass people, etc, they have to listen because it's the only town square.


I would say that social media sites are more akin to a bar/coffee shop/pub/public house.

While free speech is not protected there by the first amendment (as far as I know), it is still important to value it broadly to create an open atmosphere. I think of the role they played in the run-up to the American Revolution and in organizing workers, and until the dawn of the internet and the plasticized commercialization of my town, the role it played in my social and artistic life.

But what you say is an excellent point about where we are now engaging in public discourse; it has changed drastically, very quickly, and there are significant concerns about free speech now that we all hang out on websites mostly owned by large corporations.

It feels gross to me. Like meeting up at the food court of Wal-Mart because every other place in town has been put out of business. And the town square is almost dead and gone, both online and in real life.


What's wrong with: anything a user can do in your app's interface must also be doable by that user via API call?

You don't have to map everything so the platform interactions are the same, simply make them equal-access via API call.

It seems flawless to me but I also only spent the time I took to type this thinking about it, so it may be 98% flawed.
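To make the idea concrete, here's a minimal sketch of what "UI/API parity" might look like in code (the names and structure are hypothetical, not any real platform's design): every user-facing action is registered once, and both the interface and the public API dispatch through the same table, so no action can be made UI-only.

```python
# Hypothetical sketch of UI/API parity: a single action registry that both
# the interface and the public API dispatch through.

ACTIONS = {}

def user_action(name):
    """Decorator registering a handler as reachable from both UI and API."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@user_action("post_message")
def post_message(user, text):
    # In a real platform this would write to storage; here it just echoes.
    return {"user": user, "text": text}

def api_call(name, **kwargs):
    """Equal-access entry point: any registered action is callable here."""
    if name not in ACTIONS:
        raise KeyError(f"no such action: {name}")
    return ACTIONS[name](**kwargs)
```

Since the UI would have to go through the same registry, the API can never lag behind the interface by construction.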


Your solution doesn't address the problem the article was describing. The stated problem is the "censorship" by tech companies. Allowing posting via API rather than just a user interface does nothing to solve that. If it doesn't solve the problem, why are we doing it again?

Your solution creates the new problem that normal users don't understand what access they're granting to an API. Every API you expose is a tool for an unscrupulous app to exfiltrate data about you or your contacts. It's an avenue for sending spam. It's an avenue for monetizing your account's reputation by taking actions such as liking content, following/subscribing channels to boost their engagement numbers, scraping content that your contacts had not shared publicly.

It makes abuse-fighting much harder, since you can no longer trust the application to collect any kind of bot-detection signals, and since huge volumes of user interactions will be getting artificially concentrated at a few API gateways.

And finally, while nobody has a right to their business model, it's maybe useful to consider what the implications would be. It basically makes ad-supported services impossible, since somebody will inevitably implement an ad-free client for the service. (Or even worse, a client that just shows their own ads instead). Do you think we're really willing to go back to a world where the only free (as in money) services are ones that are being subsidized?


What if the platform added 1 day of latency to every call?


Why not platform-specific moderation? All social media content is available for consumption by a set of platforms the user authorizes. Those platforms are free to filter or moderate the content as they see fit. While not explicitly spelled out in TFA, this seems like the trivially obvious implementation.


Don’t we already have platform specific moderation? Isn’t that what people are complaining about?


We have platform specific moderation, in conjunction with high switching costs. If the content and connections themselves are owned by the platform then you have high switching costs. If the content and connections are instead owned by the user and all platforms share a compatible protocol for exchanging content/connections data, then the switching cost vanishes because there is no switching.
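A minimal sketch of what user-owned, portable social data might look like (the format and field names here are hypothetical, not any real protocol): the user's content and connections live in a plain document that any compatible platform could import, so "switching" is just pointing a new client at the same data.

```python
import json

# Hypothetical portable profile: content and connections owned by the user,
# serialized in a shared format any compatible platform can read or write.
profile = {
    "handle": "alice",
    "follows": ["bob@platform-a.example", "carol@platform-b.example"],
    "posts": [
        {"id": 1, "text": "hello, fediverse"},
    ],
}

def export_profile(p):
    """Serialize the user's data to a portable JSON string."""
    return json.dumps(p, sort_keys=True)

def import_profile(blob):
    """Any platform speaking the shared format can load the same data."""
    return json.loads(blob)
```

Because the export/import round-trips losslessly, each platform is reduced to a view with its own moderation policy over the same user-owned data.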


Supreme Court of the United States Justice Clarence Thomas issued an opinion a couple of months ago discussing big tech censorship quite extensively. The case concerned whether Trump was allowed to block people on Twitter, and whether doing so violated the First Amendment. While the case was declared moot once Trump left office, Justice Thomas took the opportunity to discuss censorship: how politicians aren't allowed to block users on big tech, yet big tech is able to block and ban politicians and government employees, and how this creates a weird power dynamic.

I would highly recommend reading his opinion:

https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf

Here's a few excerpts:

> "But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a similar manner as traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resemble[d] railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination." ... "Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See United States v. Stevens, 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech." ... "The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms. [I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of “digital platforms." ... "For example, although a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; Blum v. Yaretsky, 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats. The Second Circuit feared that then-President Trump cut off speech by using the features that Twitter made available to him. But if the aim is to ensure that speech is not smothered, then the more glaring concern must perforce be the dominant digital platforms themselves."

> "As Twitter made clear, the right to cut off speech lies most powerfully in the hands of private digital platforms. The extent to which that power matters for purposes of the First Amendment and the extent to which that power could lawfully be modified raise interesting and important questions. This petition, unfortunately, affords us no opportunity to confront them."

The last two points are important, as Justice Thomas is basically saying "give us a case which brings up these two questions and then we will have a deep look."

One can even cite Amazon's recent censorship of SCOTUS Justice Clarence Thomas's own documentary as well as Eli Steele’s documentary as examples:

https://archive.is/aNv3B


>First of all, the internet’s “marketplace of ideas” is severely lopsided at the platform level

The marketplace of ideas itself has been shown to be a bust in recent years, IMHO.

Better ideas, better models of truth and reality don't win out. People have not shown themselves interested in intellectual debate or argument. We do not see the 'best' ideas rise to the top. And before anyone asks who I am to judge the best ideas, I'll answer - nobody, but I can see that factually incorrect, scientifically illiterate, conspiratorial, harmful crank ideas gain legs in the online world we've created. Often at the root of these is a profit motive.

Is censorship the best way to deal with it? Probably not. But we do need to recognise this as a problem as well as getting het up about giant internet platforms deplatforming people.


It's ironic (or maybe appropriate) that this, talking about how better ideas don't rise to the top, is the top comment. The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them. Lots of support for this view in downstream comments (we're just intelligent apes etc).

The strawman of the existence of factually incorrect content is irrelevant in my view. Everyone loves to jump on conspiracy theories or whatever other stuff is out there as proof that people (other people) are too dumb to be exposed to the world. First, I think the relevance of, and the number of serious adherents to, the extreme versions of these conspiracies are dramatically overstated, because they reinforce a narrative. Second, I think most of it is an effect, not a cause, of people being told what to do and what to think. People uncomfortable with narratives that end in them giving something up will take shelter in alternative explanations. The problem is not the nutty theories being available; it's the way that one group forces itself on another in the first place.

All this to say, we can be adults and let everyone get the same information and make their own choice, or we can go the other way where someone who thinks they know more than us (and you hope it's you but it doesn't have to be) gets to decide. Frankly, the latter idea is ridiculous to me.


> The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us"

Not what I said. However the idea that everyone is equally able to parse out good information from bad is absurd, and I make no claims to be one of the group of enlightened ubermensch who should be able to decide for the rest. Neither do I argue that such people exist or are necessary.

> we can be adults and let everyone get the same information and make their own choice, or we can go the other way where someone who thinks they know more than us (and you hope it's you but it doesn't have to be) gets to decide.

Or we can ditch bland cliches and false dichotomies and look at pragmatic steps that can be taken to, for instance, improve access to good information, improve education around critical thinking, and perhaps (as mentioned in other comments) re-examine things like automated algorithms that feed people ever more extreme content because it increases engagement levels, clicks and ad revenue.

But no, of course, you're right, taking any action at all is akin to fascism. Totally.


I don't understand how one could look at the current state of affairs regarding rampant misinformation online and say "yeah, this is the best we can do without compromising our ideals". It's not good enough.


I think that's the real issue in all of this discussion: there is simply no will.

It's too hard, it's too impossible, we've decided big tech has already won, we've ceded decades of open progress in tech to moguls who don't give a fuck about us, or we're so ideological that any step to the "left" or "right" is perceived as abdicating principles of freedom.

My grandfather was a former Jew (thanks, anti-Semites) who flew over Nazi Germany. My other grandfather served in Europe after that whole debacle. I feel I have as much right as anyone to say that it's bullshit going on Reddit and running into the same hateful misinformation in every thread, "arguing" with holocaust deniers on r/conspiracy who think Sly Stallone kisses dolphins and use the same canards they've been using for 2,000 years, and I can promise they weren't the first to notice my crypto-Jewishness. These same dummies love Q and have never found covid information that fit their narrative that they didn't love.

We really gonna relitigate (and lose) historical issues like the Holocaust for the next 10,000 years? We gonna sit by and whine about "principles" as people are murdered on a daily basis because of misinformation like that? That's not good enough.

What marketplace of ideas? How many serious challenges have there been to big tech by any company in the last decade? What freedom is there if you can't walk out your front door without being directly impacted by disinformation of various kinds on a daily basis?


I'm sorry, but I don't quite understand how Reddit--a bastion for young 20-somethings, among whom leftism is a major demographic--could ever be construed as a Q-Anon stronghold.

I feel as though this may more reflect your fears than the state of the site, and I feel as though so reflexively writing off a population that already largely agree with you illumines the opposite argument--that reaction to speech is largely an overreaction, and we will, without careful consideration, largely consider any bloc to be constituted of what we fear.


You seem to be projecting onto me what you think I'm projecting. ;)

I didn't call it a stronghold. There are certain subreddits that were aligned: The_Donald before it moved, conspiracy, conservative, etc. You ever see the greatawakening subreddit, where they were acting like Trump was giving secret messages in speeches and calling for executions?

If that exists on a site with leftism as a major demographic, what does it look like elsewhere? You're making my point for me.

BTW, one of the longest-running mods of conspiracy that finally got banned has admitted that they are a Russian national. Not that this means anything, but it's interesting that a subreddit could be dominated by an individual with such strong beliefs about politics in another country.

https://www.dailydot.com/debug/r-conspiracy-axolotl-peyotl-b...

To give context, I'm very much what some people would mock as an "enlightened centrist" (I felt so sad when I realized this is bad?). I think much of the "hateful misinformation" is actually more aligned towards the left. I can't read a thread about pit bulls or my home state without people frothing at the mouth and acting like my dog should be put down today or that my state is an ISIS stronghold.

But constitutionally? If someone is being an asshole on your property, it's your right to kick them off.


I don't believe I am, though--you are, if I'm not mistaken, asserting that it's some prolific iniquitous undercurrent which these platforms are subtending in some manner. (Which would seem somewhat misleading to me, as your assertion that r/TheDonald "moved" implies it did so of its own volition, rather than being banned by the site.)

Any platform harboring content will, as a matter of course, simply through the caprice of a moderator, let slip by insane opinions--but opining that these are some growing tide and, more dangerously, representative of their moderate counterparts (a la the r/Conservative subreddit, which seems constituted of largely by-the-numbers right of centers,) seems disrespectful to all parties involved and serves only to distract from your central assertion that communicating these ideas will in some way seed wanton chaos. (Comparing the ideas directly to those that precipitated the tragedy of the Holocaust.)

Edit, as I'd written my reply to a previous version of your own: I am sympathetic to the idea that seeing these more fringe ideals is unfortunate--but the argument which I believe bears greater importance is that acting in this manner against them, striking them from the whole of our public discourse and pre-empting any who could, in some way, divine inspiration from the muck is far more deleterious to discourse. It serves all too easily as a means to silence disquiet and cast a veneer of unanimity.


I think the number of discussions that are off-limits should be very small and platforms should be much much more transparent.

However, if allowing certain discussions means also allowing other discussions, I'm not broken up if sites like reddit were to ban a subreddit like conspiracy or at least try to reshape it to something much more objective.

If the owners of a property decide certain views are abhorrent, that's their right; I can't think of a valid moral or legal complaint against that - it is their property. If we lack competition, that is an issue of market competitiveness more than propaganda.


You're correct in that it's really a market capture problem as things are. But the popular proposals that try to co-opt Big Tech into the censorship game are popular precisely because of that market capture - it's a way to make it extensive without putting the government in charge of it explicitly.

So we can't really treat these two as completely separate right now. Indeed, if those schemes are allowed to go forward, the next thing you'll hear is that we can't break Facebook etc up, because doing so will limit how effectively some information can be suppressed. The more power is concentrated, the more it seeks to sustain that state of affairs, and the better it is at that - so why would we hand those companies so much power when they already are a major problem?


Who decides what's off limit?


Whoever makes that decision for each platform.


There're certainly multiple places one could draw the line--the efficacy of a given position for these sorts of things varies by your objective or simply the severity one perceives.

Thanks a ton for providing the opportunity for some discussion on this!


Who is being murdered on a daily basis because of holocaust deniers?


https://en.wikipedia.org/wiki/Pittsburgh_synagogue_shooting

https://en.wikipedia.org/wiki/2019_Jersey_City_shooting

Every day? No. But when it's 2021 and you gotta worry about getting gunned down for being a Jew?

Putin loves to use anti-Semitic rhetoric when convenient. Which is the big part of this...the stand pat and do nothing approach doesn't work when the resources of nation states can (logically) and have been behind harnessing disinformation.

The Jewish issue isn't the only issue, better examples might be the Christchurch shooting where the gunman was livestreaming on and because of 4chan.

If you get a chance check out HBO's doc about Jim Watkins and Q.


Christchurch shooting is a good example, because the follow-up crackdown on associated content showed just how far this can go. Remember his "manifesto"? In NZ, its distribution was banned outright by law (or rather government order, but they have laws on the books that allow for it). Not so in Australia - they couldn't find any legal means to restrict it, so the government basically informally asked the ISPs to "do something".

And they did - to the point where a bunch of websites with forums were blocked outright because of their hands-off policy wrt comments (usually in some particular subforum; it's a fairly common way to keep it civil elsewhere) meant that there were a bunch of posts with links to the document.

The end result is that a bunch of completely unrelated stuff was blocked in Australia outright for a while, by private companies in charge of communications acting in unison - effectively, a censorship cartel - with no political or judicial recourse, since the government was not involved in it, and the ISPs were in their right, legally speaking.


Ironically, someone (6f8986c3) replied to this claiming to be Jewish and sharing his experiences...and his post was flagged and removed... In a thread about censorship.


He didn't do it productively; he made it into a rant about political parties. You can't get useful discussion on such a loaded topic that way.

That's another part of this...you can't just say whatever you want and expect there not to be consequences.


It was no more of a rant than yours. But you disagree so you downvote and flag.


See?


I don't understand how one can say "it's not good enough" without proposing options that are actually better. I've yet to see any; all the censorship proposals on the table are far worse than the present state of affairs.


What, we aren't allowed to talk about issues if we haven't already solved them? Surely you can see how ridiculous that sounds. Ironically, you're also saying "these proposals aren't good enough" without proposing options that are actually better.


Of course we're allowed to talk about them; it's just that "it's not good enough" isn't helpful.

The option that is actually better than all the proposals on the table is what we have right now. I'm not claiming it's perfect, or even good - merely better.


> improve education around critical thinking,

Aye. This would improve so many things. Critical thinking is, in fact, not an instinctual ability. It's perfectly okay to talk about mathematical illiteracy, but propaganda literacy? "No, no, we can't imply that people aren't capable of thinking for themselves." We have multiple historical instances of extremely successful disinformation campaigns, and people fell for them en masse.

These "let people decide" trolls are effectively correlating critical thinking skills with intelligence, and it's patently absurd. Philosophers, some of the greatest thinkers there are/were, have spent years of their lives pondering the art of critical thinking.


Educating people with better critical thinking skills is great. Deciding that educating people is too hard and that we need to allow large monopolistic platforms to decide what can be said, with zero liability, is much less good.

Mandating data interoperability between platforms seems like a good middle ground that allows diversity of moderation while reducing the monopolistic power of large platforms.


> I make no claims to be one of the group of enlightened ubermensch who should be able to decide for the rest

I would say you did:

> And before anyone asks who I am to judge the best ideas, I'll answer - nobody, but I can see that factually incorrect, scientifically illiterate, conspiratorial, harmful crank ideas gain legs in the online world we've created. Often at the root of these is a profit motive.

You asked the right question (who are you to determine the truth?) but then you didn't answer it. You continued on to imply that, in fact, you are in a position to sort out what's true unlike those other, inferior people who fall for conspiracy theories and pseudoscience.

> Or we can ditch bland cliches and false dichotomies and look at pragmatic steps that can be taken to, for instance, improve access to good information, improve education around critical thinking, and perhaps (as mentioned in other comments) re-examine things like automated algorithms that feed people ever more extreme content because it increases engagement levels, clicks and ad revenue.

"Improving access to good information" is usually a euphemism for some kind of censorship. As far as I'm concerned, all of these "problems" are not problems; they're justifications for controlling other people, and expressions of impotent rage at the failure to control those people. I'm vaccinated and I voted for Biden, but I do not understand why people care that others decide not to get vaccinated or voted for the other guy. Part of freedom is the freedom to be wrong.

I think these are not new problems. Democracy, pluralism, free speech, trial by jury, etc are our solutions to these problems. They aren't particularly satisfying solutions because, as you said in your first post, the truth doesn't always win. Rather they're tragic compromises. But I don't see any serious suggestions about how we should improve them.


> You continued on to imply that, in fact, you are in a position to sort out what's true unlike those other, inferior people who fall for conspiracy theories and pseudoscience.

You missed that I was drawing a distinction between what is commonly thought of as the marketplace of ideas - in which rational discourse and honest debate enable people to learn, compare, agree and disagree, find greater truths, etc. - and the spread of misinformation based on falsehood, often perpetuated for profit.

We absolutely can, as a society, say that (for example) misinformation about vaccines, where we are dealing with matters of fact, is not on a level with actual vaccine science. This is not really up for honest debate. We can debate until the cows come home about what it means, whether any opinions should be drawn or any action taken, but motivated lies are just not on the same footing as factual information.

> "Improving access to good information" is usually a euphemism for some kind of censorship.

You assume bad faith here, again. I think there's a lot that can be done by way of giving access to scientific information in accessible ways.

> I'm vaccinated and I voted for Biden but I do not understand why peope care that others decide not to get vaccinated

Well, firstly, because that decision affects more than just those individuals, and puts others (including the already vaccinated and those who cannot be vaccinated) at risk. And secondly, because some of those making the decision not to vaccinate, and putting themselves at risk, have done so armed with bad information.

The nature of the problems may not be new, but the scale and severity seem to be.


> with actual vaccine science

Who decides what that is? Plenty of actual scientists publish papers in actual science journals that conflict with each other and with official public health advice. The whole vaccines cause autism idea came from what looked like actual vaccine science.

Knowledge isn't actually that sure. Sometimes the authorities are wrong. Remember when bread was at the bottom of the food pyramid?

> but the scale and severity seem to be.

Seem to be or actually are? Are you using "facts" or media sensationalization to form your opinion?


So there’s no way at all, to your mind, to distinguish millions of shitposts about “The covid vaccine causes infertility, and contains 5G chips” from, for instance, genuine risk information?

We truly are swimming in a world in which nothing is true. Or you are anyway.


I can make up my own mind for my personal opinion but I don't want other people's ideas to be censored. Even if they're factually wrong, it's still OK to share them, I think.


> The covid vaccine causes infertility

The bizarre thing with this hoax, is that it looks like the mRNA vaccine increases swimmer motility.[1] Vaccinated folks should have no issues having babies if, for example, they need to improve their 5G reception at home.

[1]: https://jamanetwork.com/journals/jama/fullarticle/2781360


You take an extreme example (5G chips in vaccines) to support the idea that we can draw a line in the sand and say what's true & what's false. But if you take an example closer to the line, things start to become less clear. Are masks useful to the general public? If you said "yes" a year ago, you would've been censored for spreading misinformation. If you said "no" yesterday, you would've been censored for spreading misinformation.

Anyway, if you think it's so easy to do a good job censoring information, why don't you point to a single example where that worked out? Just one is sufficient. It's okay, I'll wait.


You take a moderate example to support the idea that we cannot nevertheless have a threshold where we can be sure that something is false. But while we cannot clearly say where the land ends and the ocean begins, there are large swaths of places that we can positively identify as ocean.

The OP never claimed to have an appropriate solution so I don't know what you are talking about regarding them thinking censorship is easy.


Here's a claim that's definitely in the ocean - Mohammed is the messenger of God. Do you want that claim banned from the internet because it's clearly false (God doesn't even exist)? Do you want to exterminate Islamic faith?

Just because it's wrong, doesn't mean nobody should believe it or be exposed to it.


That particular statement is in fact neither known to be true nor known to be false for some definitions of the Muslim God, since there is no secular evidence for either the existence or the non-existence of God under those definitions.

But I agree with the point you are trying to make: many people hold false beliefs crucial to societal institutions that cannot realistically be suppressed by heavy-handed censorship without also destroying meaning and satisfaction for many people, and the society itself. And of course, holy texts often contradict themselves, yet some consider all of it true.

However, you are arguing against a strawman, since OP never claimed to have a solution, nor even that censorship was the appropriate response; nor do I claim either.


> However, you are arguing against a strawman, since OP never claimed to have a solution, nor even that censorship was the appropriate response, not do I claim that either.

OP strongly implied that it's easy to distinguish which claims should be censored and which claims should not be censored. This is complete fantasy.


They didn't claim anything that strong. They claimed that some COVID misinformation is easily identifiable as false. They were silent on many other statements, whereas you think they said that all statements are easily distinguishable as true or false. Indeed, it's complete fantasy, but you're arguing against a strawman.


> The OP never claimed to have an appropriate solution so I don't know what you are talking about regarding them thinking censorship is easy.

Disagree. This is what OP said, among other things: "We absolutely can, as a society, say that (for example) misinformation about vaccines, where we are dealing with matters of fact, is not on a level with actual vaccine science. This is not really up for honest debate."


You are confusing the identification of certain statements as false (which is easy for some things), with the enforcement of censorship (which OP does not claim to have).


Ok. Allow me to rephrase: can you point to a single example where an entity was given power to decide which statements are false (for censorship-related purposes) and subsequently did a good job? To be specific, I mean doing a good job of identifying which statements are false and which statements are true.


Well, that's not something the OP has a good answer to, and they admit it. In fact, they've written,

> Is censorship the best way to deal with it? Probably not. But we do need to recognise this as a problem as well as getting het up about giant internet platforms deplatforming people.

> make no claims to be one of the group of enlightened ubermensch who should be able to decide for the rest. Neither do I argue that such people exist or are necessary.


> The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them.

Gullible people exist. It's such a well-known phenomenon that the term "gullible" exists to describe people who fall for what should be obviously false information and lies. We all have our instances of gullibility, we all have domains we know nothing about, and there are countless people who prey on those lacking knowledge in some area.

If people go around in real life deceiving others, someone might step in and warn about the scam. There'll be signs in tourist areas that scammers frequent. Banks will send out notices to help their customers avoid financial traps. Schools teach kids to look out for bad information and misleading conclusions. Signs are posted in parks warning people not to venture into certain areas because they're likely unprepared and could die (and people still die anyway).

This is all generally accepted as good.

But flag a post on some social networking site with a "This might be fake. Be careful", and people go wild saying the elite are trying to police us and they're destroying our freedoms.


Flagging a potential scam or potentially self-serving misinformation is not censorship and is very different from blocking content and banning posters.


The term "elf" exists to describe a kind (well, a few kinds) of supernatural, vaguely humanoid being. This is not generally considered evidence for the existence of elves.


Exactly. Only a gullible person would think so.


> All this to say, we can be adults and let everyone get the same information and make their own choice, or we can go the other way where someone who thinks they know more than us (and you hope it's you but it doesn't have to be) gets to decide. Frankly, the latter idea is ridiculous to me.

This is a false dichotomy and a rather romantic notion. Independent adults rarely, if ever, make up their minds from first principles. They already cluster around people who think, or convince others, that they know better. Little of what we're saying here is original thought; we are mostly riffing on what we've read elsewhere. And that's OK. That's how distributed cognition saves us a lot of time, when it's working correctly.

But the distributed cognition of independent adults reaches wildly irrational conclusions too; I don't need to list past atrocities based on ridiculous ideas.

So while the failures and wins of distributed cognition are yet to be understood properly, it alone is not the culprit, nor can it be corrected via the virtues of atomized, individualized cognition.

One thing is clear: ideas distributed over ad-fueled companies' screen real estate conform to a particular topology and a particular objective function. The normativity of engagement is not that of reality. And the irrationality of those systems affects us even from afar, permeating our culture and even non-ad-based forums like this one.


> The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them.

I find this idea abhorrent as well, but I’m curious if you find it abhorrent because you think the claim is false and/or unsubstantiated, or if you find it abhorrent because even if it’s true it ought to be rejected due to some moral principle.


(not parent) I think it is pretty easy to substantiate it (some people are actually deficient), but hard to substantiate to the degree that would justify the measures people propose. Yes, you can find a few nutjobs, but do they justify these truth-seeking policies for the general population? What suggests that these policies are even beneficial?

But even if it was shown to be beneficial, I would still be against it on moral grounds I suppose (but I'm not sure since that might depend on how "beneficial" it is).


I've not really thought about it this way, but I believe both statements. Nobody should be a slave to how someone else wants them to behave, because people are autonomous beings with their own capacity and right to make their own decisions, and there is no universal law about how we should decide or prioritize things. I believe, of course, in many of the societal constructs we have (most of the ten commandments, for example, and taxes), but I think there has to be an almost impossibly high bar for any new imposition we make on people's right to have their own priorities, because we're all equal in the universe.

Maybe a weird answer, I'm not a philosopher, I just don't think there is a scale of rights that people have, or in the idea of an elite that has special decision making power because it is enlightened.


The best statements of this problem are in Public Opinion and The Phantom Public by Lippmann. You don't need to share his answers to those questions (which are an enlightened corporatism/fascism), to realize that to be serious about governance, answering them is a prerequisite.


>The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them

This is the basic hierarchical nature of any complex society. In a well functioning society those people, called elites (today already considered a derogatory term), are being funneled to the top where they're put into positions of power.

I don't really know why that's in itself supposed to be abhorrent, because it's how any functional organisation is structured. The problems start when institutions break down and the quality of your elites is reduced, but the solution is not some sort of choose your own adventure story where you let the inmates run the asylum. There's no organisation on earth that survives like that.

The entire notion that the vast majority of adults 'make their own choices' is a complete fiction to begin with. Choices exist downstream from culture and culture itself is produced by elites and consumed by 'adults', and so the choice you have to ponder is which people you want to be in charge of producing your culture. Might be Harvard, Zuckerberg, Tucker, or the Pope, but the people have nothing to do with it.


This assumes that people reach "elite" status due to merit rather than through complicated in-group-favoring dynamics that serve to entrench certain class/ethnic/political interests. The more gatekeeping you allow the elites to do, the less meritocratic the elites become.

So the idea that some people are better at some things than other people is indeed true, but the idea that we can reliably measure that without the metric being gamed and corrupted is false.


> Choices exist downstream from culture and culture itself is produced by elites

You have it backwards. People who produce or shape culture become elites, it's not that people who are already elites are the only ones producing culture (or rather, influencing it). That has very different implications from what you're describing.

In fact, elites who intentionally try to shape culture often get laughed at (cue the "Imagine" video).


The elites at any given time are actively promoting this premise. But, seeing how elites get wiped out and replaced now and then, and society keeps going on, I think this "basic hierarchical nature" very much ought to be questioned by anyone who does not fancy becoming part of the current or future elite themselves.


It's not ironic, and not surprising. HN is very willing to "censor" (i.e. moderate, ban, etc) and on top of that, "politics" is considered off-topic altogether according to the guidelines.

In such an environment, the better ideas often do rise up to the top. Twitter and Facebook are not such environments.

"We" shield ourselves from bad ideas because this forum would be overrun if we didn't.


> we can be adults and let everyone get the same information and make their own choice

One of the problems is that we don't all get the same information.

> or we can go the other way where someone who thinks they know more than us (and you hope it's you but it doesn't have to be) gets to decide. Frankly, the latter idea is ridiculous to me.

You've bumped into the exact same problem that exists with our democracy. Do you want our future determined by a few well-informed, rational people? Or do you want it determined by an ignorant, short-sighted mass?


> Do you want our future determined by a few well-informed, rational people? Or do you want it determined by an ignorant, short-sighted mass?

"I am obliged to confess I should sooner live in a society governed by the first two thousand names in the Boston telephone directory than in a society governed by the two thousand faculty members of Harvard University."


Instead of making a snarky smear upon the faculty of what is probably the most coveted university in the world, why don't you just come out and make an informed argument on what you find so wrong with them?

“There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'”

― Isaac Asimov


As an immigrant to America, I have thought hard about the differences between it and the developing country I came from. We don’t have any shortage of intellectuals, and those Harvard professors export their ideas for the whole world. Professors at elite colleges have created exactly zero prosperous nations. I’m convinced that what we’re missing is ordinary Americans.


> Professors at elite colleges have created exactly zero prosperous nations.

It turns out that you can make that statement for any group of people. It takes all kinds of people to build prosperous nations.

It's also particularly ironic that you're writing this because of a project that professors (and grad students) at some elite colleges started back in the 1970s, with some funding from the defense department. And their creation (the Internet) led to historic amounts of wealth creation in a stupendously short amount of time.

I also wonder where Britain would be without the numerous professors at elite colleges whose research in physics, mathematics, and chemistry made the Industrial Revolution possible.


We’re talking about who to put in charge. The US was built with basically just a few kinds of people in charge: farmers, lawyers, businessmen, and military officers. It was already wealthy before there were any Ivy League professors doing anything.

There are countries where the government was heavily influenced by academic theories. Marx was a PhD and a professor and his political philosophy was the basis for several quite unsuccessful efforts at governance.

This is not a knock on academic elites in general. I want them sitting around and building the Internet! But I want a farmer actually running the country.


> The US was built with basically just a few kinds of people in charge: farmers, lawyers, businessmen, and military officers. It was already wealthy before there were any Ivy League professors doing anything.

The Ivy League wasn't the "Ivy League" back then - just a bunch of fledgling colleges without much of a reputation. America was a different country. Smart, ambitious people went into planting, trading, or soldiering. What things were like 200+ years ago has no bearing on how they should be today.

> Marx was a PhD and a professor and his political philosophy was the basis for several quite unsuccessful efforts at governance.

One academic's theories led to some failed governments, therefore academics are bad at country-building? Marx wasn't even in charge of any of those countries. Other people (such as Lenin, a lawyer, and Trotsky, born to a farmer) read his work, went "this shit sounds dope", and tragedy ensued.

I can point you to any number of countries run by "farmers, lawyers, businessmen, and military officers" that weren't as successful as the US. As I already mentioned, Lenin was a lawyer. So was Fidel Castro. I hope I don't need to go through a list of the various military dictators.

> But I want a farmer actually running the country.

I want smart people with good ideas, integrity, and communication skills running the country. You can find people like that in every field. I couldn't care less what their background is.

The US succeeded (in part) because the early leadership was of incredibly high caliber and integrity. George Washington stepped down voluntarily from the presidency after two terms. It seems normal to us but it was unthinkable back then. Everyone just assumed he'd stay in charge until he died.


> One academic's theories led to some failed governments, therefore academics are bad at country-building?

No, that’s just the most egregious example. Circling back to my original point: I spend a lot of time thinking about what makes countries rich and how you turn poor countries into rich ones. Academics spend tons of time studying psychology, culture, politics, etc. But I can’t think of any of those theories that has ever been implemented in the real world to help a poor society develop into a rich one. When Lee Kuan Yew, who made Singapore rich in a generation, talks about development, he doesn’t talk about academic theories. He focuses on the culture of the ordinary people.

> I want smart people with good ideas, integrity, and communication skills running the country.

No. Ideas don’t make countries successful. Integrity, yes, but academics don’t have any special advantage in that regard. And the petty bureaucracies and box checking environment of academia selects against many of the other traits required for good leaders.

> I couldn't care less what their background is. The US succeeded because the early leadership was of incredibly high caliber and integrity.

The US succeeded because they were English by culture and departed little from the Anglo system of government and society, which was not created but evolved organically over centuries.

Revolutionary France is a good example of a society structured by “smart people” according to “good ideas.” It ended in disaster and bloodshed.


> When Lee Kuan Yew, who made Singapore rich in a generation, talks about development, he doesn’t talk about academic theories. He focuses on the culture of the ordinary people.

He (and most national leaders, elected or otherwise) works at too high a level to talk about academic theories in general conversation. You don't think he and the people working in his government studied economics, history, and politics? They just made Singapore prosperous by getting everyone to work harder and study more?

> academics don’t have any special advantage in that regard

Neither do "farmers".

> And the petty bureaucracies and box checking environment of academia selects against many of the other traits required for good leaders.

That's just broad stereotyping. We can do that for any of the professions you think should be in charge. "Military officers are rigid, hawkish, and overly inclined to action". "Businessmen are short-sighted and focus on the bottom line above all" and so on.

> Revolutionary France is a good example of a society structured by “smart people” according to “good ideas.”

Not really.


> He (and most national leaders, elected or otherwise) works at too high a level to talk about academic theories in general conversation. You don't think he and the people working in his government studied economics, history, and politics? They just made Singapore prosperous by getting everyone to work harder and study more?

There is a difference between people with judgment governing with the advice of academics, and putting academic theories directly into action. A good example during pandemic were governors who listened to doctors but applied their own judgment after listening to other stakeholders, and those who outsourced decision making to credentialed experts.


The amount of goalpost moving in this comment thread left me quite dizzy.

We went from "academics can't build nations" to "academics can't lead nations". Then to "America became great because farmers, lawyers, and soldiers were in charge" (while ignoring failed nations that also had farmers, lawyers, and soldiers in charge). Finally landing upon Lee Kuan Yew and national character building (no idea how that's related).

There were no actual examples of an academic taking charge of a country and that country failing due to them being an academic (vs just being corrupt, despotic, insane, or plain incompetent). On the other hand, I provided lots of examples that showed academics have made modern society and the economy possible.

I'd argue that most academics don't have any interest in politics or leadership, which causes them to be relatively underrepresented in the arena. Regardless, I'm not biased against people due to their profession, as you appear to be.


>Professors at elite colleges have created exactly zero prosperous nations.

You mean all by themselves? Well neither have plumbers, but I'm not about to disparage them over it, either.


Are you really asking for an informed argument about what's so wrong with the rule of the Ivy League optimates? Isn't it pretty simple just to cite Robert McNamara and call it a day? What am I missing about the complexity of this issue?


They are >90% very left-leaning, which is common in elite academic institutions, and such groups are typically contemptuous of average Americans and their religious views.

This would set up a likely conflict, such as what we got a hint of when the masses elected Trump as a big middle finger.


Why is it that the politics of academics is found to be so distasteful, when we accept the politics of businessmen such as Peter Thiel and Rupert Murdoch as a matter of course? In a country that has always worshiped technological innovation, why are academics, who provide the very seeds of this innovation, disrespected so easily?

Look where your governance-by-middle-finger got us. It got us an egomaniacal dotard, drunk on power, who shifted his politics when the opportunism suited him. His TV show got him just enough notoriety to move the needle among the masses. When he found that his toxic personality, based on name-calling and blustery buffoonery, found a reception among right-wing voters, there was nothing to stop him from winning them over. But this was a man who could work with no one unwilling to serve him without question, be they judges or legislators, and that was certainly a deal-breaker for our allies.

His world-view was entirely America-centric, at a time when China had earned and demanded a role he could neither conceive of nor negotiate. Behind the scenes, he ruled a chaotic White House that was fortunately tested little. Still, the stories that escaped from this White House show that he was not to be contained by any legal constraints, and that he fostered an atmosphere where staff worked at cross-purposes to one another. Their singular focus was on the media, yet they couldn't even manage that successfully, despite the allegiance and sycophancy of Fox.

Finally, COVID-19 was the test they would not pass. Although Trump was due to receive some sympathy from his two impeachments, once America saw the reckless way he managed this pandemic and the huge toll it took on us, there was little chance he could win the 2020 election. What we saw after the election was an unheard-of, nothing-to-lose strategy of trying to steal the election by any falsehood necessary, whether in the courts or in the streets. The books coming onto the market now show that our wildest imaginings of what could have been happening in that White House paled compared to the horrid reality of what did.

We likely have a mere few individuals, such as Gen. Mark Milley and Mike Pence, to thank for holding our country together in the moments when it was most vulnerable to those whose only concern was power.


“ Look where your governance-by-middle-finger got us. “

Your??


First, lumping engineers in with the bulk of academia is disingenuous to say the least.

Second, what does it say about the desirability of what’s being sold by the technocratic elite that half the country voted for a second term for the egomaniacal dotard (for he surely was that), and most of the other half of the country got behind Joe Biden, a man from a working-class background and a graduate of a second-tier university, known for his popular appeal rather than his intelligence?


The “politics of academics is distasteful” because they too often lack any basis in the lived experience of real people.

You point to Trump, but I’d note that not even Democrats like the academic social engineers. On Super Tuesday, Elizabeth Warren (a Harvard professor) had an embarrassing performance. Even in NYC, home of the country’s technocratic elite, voters passed over the technocrat in favor of someone who promised to carry a gun to church as mayor.


If you deplore the left-wing politics of academics because (perhaps) they live in an impractical, protected world, do you also deplore the right-wing politics of the clergy -- who also certainly live in a protected, isolated world? And what experience do the clergy have in the lived experience of 'real' people?


American clergy are a whole lot more diverse in their ideology than academics. Mainline Protestant clergy are often extremely progressive. Islam has a strong socialist streak, etc. They are also typically a whole lot more experienced than academics with people’s real lives and the problems of real communities.

Regardless, nobody is advocating having clergy run the country. (And no, listening to clergy as stakeholders in society and responding to voters’ whose beliefs might be rooted in religion is not the same thing as what the left seeks to do with academics. Putting a clerical “czar” in place to “nudge” people to desirable behaviors would be wild even for the Bush administration.)


That's called "sortition" [1], but currently not part of our democratic process.

[1] https://en.wikipedia.org/wiki/Sortition


> You've bumped into the exact same problem that exists with our democracy. Do you want our future determined by a few well-informed, rational people? Or do you want it determined by an ignorant, short-sighted mass?

Isn't this an argument for limiting the power we give to those in power? It's unrealistic to think that only people who are right will be in charge (I doubt I need to give examples). The realistic thing is strong controls on the power they wield, so that "short-sighted masses" cannot take over. This idea underpins liberal democracy.


Yes, but the tricky part here is that "those in power" is far more than just those in government. As this very thread topic itself demonstrates, there are plenty of people and organizations that are not the government that have a large amount of power.

In a democracy, we have a way to limit the power of our government -- but what's the way to do that for those in power who are not in government? The traditional approach has been to use government power to check non-government power, but this seems to run counter to the whole "limiting the power we give to those in power." But if we don't use government power to check outsize private power, then we are again ceding too much power.

My own view is that the answer is less about how much power we grant than about how much accountability there is -- ie, broader and deeper democracy. But that seems to work mostly in theory; in practice this often gets circumvented by those with power and we get shallow democracy with limited accountability and thus unsatisfactory restrictions on both governmental and non-governmental power.


This is a very good point (it's my view that government has abdicated a lot of the power it did have in recent years and tech has filled the void, but that's another story). When I say "those in power", I mean the broad interpretation, not just government.

I agree that we need norms that limit any concentration of power. We've had anti-monopoly laws on the books for 130 years that serve a valid purpose. We may need more laws to deal with recent constructs, i.e., platforms. I can see a superficial contradiction between not wanting government power and giving government the power to break up monopolies or platforms. But if the overall goal is a restriction on how power can be concentrated, it's still in keeping with the idea of limiting what "those in power" can do.

I'm sure I've missed something. I definitely take your point.


Yes and this is why individual freedoms and local decision making should take precedence over collectivist thought, tyrannies of the majority, and top-down decisions.


The ignorant, short-sighted mass. That answer is kind of in the kernel of our civics. You're not supposed to have to think before answering that question. :)

(I mean those first two sentences non-ironically.)


> Do you want our future determined by a few well-informed, rational people? Or do you want it determined by an ignorant, short-sighted mass?

An elite in practice means that we're stuck with irrational beliefs such as Lysenkoism. Democracy allows beliefs to compete and often correct ideas win. It's far from perfect but still has a better track record than self-styled rational elites.

The following quote is from Neal Stephenson's In the Beginning Was the Command Line.

> But more importantly, it comes out of the fact that, during this century, intellectualism failed, and everyone knows it. In places like Russia and Germany, the common people agreed to loosen their grip on traditional folkways, mores, and religion, and let the intellectuals run with the ball, and they screwed everything up and turned the century into an abattoir. Those wordy intellectuals used to be merely tedious; now they seem kind of dangerous as well.


"The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them. Lots of support for this view in downstream comments (we're just intelligent apes etc)."

I couldn't agree more. While no one can deny that we are all born with different capacities (intellectual or otherwise), the argument that some people are essentially incapable of taking care of their own lives is a very slippery slope, and a cynical one at that, viewing humans as incapable of learning or improving. I find it truly repulsive.


IMO your stance basically amounts to "propaganda doesn't work". History would suggest otherwise.


Of course propaganda works. So use it to convince people of the truth instead of just muzzling them.


This post is pretty off-topic and unresponsive.

A central premise of Doctorow's argument is the "marketplace of ideas" metaphor. Proceeding from that framing, he proposes that the ACCESS Act as an important middle ground that could, among other things, help stave off the death of Section 230. OP points out that our better angels don't always win in that marketplace.

Let's get less abstract. The article envisions a market-based approach toward online censorship. The idea that a "marketplace of filters" would create a "live and let live" scenario seems extraordinarily naive to me. It seems much more likely that making it easier to choose filter bubbles -- and introducing a profit motive into the construction of such filters -- would just escalate the appetite to censor, on both sides.

> All this to say, we can be adults and let everyone get the same information and make their own choice, or we can go the other way where someone who thinks they know more than us

You seem to be confusing critique with a proposal of some alternative. Just because OP believes that the "marketplace of ideas" thesis is a farce and that the ACCESS Act is probably a non-solution to the wrong problem, doesn't mean OP wants to censor people.

It's possible to both believe that Doctorow's argument is wrong and also not be a full-throated supporter of aggressive censorship.

In fact...

> The most abhorrent idea...

You take rather flagrant liberties when interpreting parent's post, constructing an unrecognizable strawman out of a post that contains phrases like "Is censorship the best way to deal with it? Probably not." and "And before anyone asks who I am to judge the best ideas, I'll answer - nobody".

> The strawman of...

Those in glass houses...


> The most abhorrent idea of all to me is that there are stupid, weak people that are worse than "us" that we need to shield from bad ideas because they are not smart enough (like we are) to handle them. Lots of support for this view in downstream comments (we're just intelligent apes etc).

What you just described is called representative democracy, which was essentially created because direct democracy didn't work, for exactly the reasons you mention (many people can't actually think for themselves at the political level).

> All this to say, we can be adults and let everyone get the same information and make their own choice

You're completely ignoring the crux here: it's about both access to information and the algorithms that present signal vs. noise. Pretending that my 95-year-old grandma can reliably recognize that a Fox News segment is sensationalized drivel, after being primed for 30+ years that it's "news" and not entertainment, is just as silly as you thinking this whole suggestion is ridiculous.


The approach of “people are gullible and need someone smarter to show them the truth” of course requires someone “smart enough”, and no doubt many HNers consider themselves those people. Which is a hilarious take on HN if you ever read a discussion on health matters.

The amount of bad information, random conjecture and conspiratorial thinking is pretty shocking.


I did not interpret their comment that way. I see part of the problem being that the spaces we hang out in now are not conducive to productive conversation with a variety of people and instead nurture division due to perverse incentives and poor design.


Why is it ridiculous? You don’t respect the potential injustice which can occur due to disparity in information and/or knowledge. Think about the popularity of crypto. How can you think your position of complete lack of regulation is moral? You think it’s okay to allow the targeting of the economically vulnerable to continue without restrictions via ads or - my favorite - celebrity endorsements?

This is ridiculous.


It's not clear to me that censorship isn't making it worse. You have a mass populist/anti-elitist movement in the country. You're telling me that a solution to this is to have elites at multi-national mega-corporations deciding what information should reach the unwashed masses?

What exactly is the end-game here? You're not going to beat (or educate) the populism and conspiracy theories out of Americans. We're talking about a country where a prominent tech billionaire died because he disregarded "the medical elite" and went on an all-fruit diet to treat his pancreatic cancer.[1] That river is wide and deep, my friend. Telling people that the elites are controlling information sharing online for their own good is unlikely to have the desired result.

[1] https://www.forbes.com/sites/alicegwalton/2011/10/24/steve-j...


> You're telling me that a solution to this is to have elites at multi-national mega-corporations deciding what information should reach the unwashed masses?

No, I’m specifically not telling you that! I’m not sure I have any good solutions but I specifically called out that censorship probably wasn’t the best plan!

I just wanted to say that the concept of a marketplace of ideas, as commonly envisaged in terms of ideas with merit rising to prominence, seems to be naive, and when we’re building systems to facilitate an uncensored or uncensorable marketplace of ideas, perhaps we should keep that in mind.


"According to the American Cancer Society, for all stages of pancreatic cancer combined, the one-year relative survival rate is 20%, and the five-year rate is 9%." (https://pancreatic.org/pancreatic-cancer/about-the-pancreas/...)

I would hesitate to say your prominent tech billionaire died because of diet choices.


He had a highly curable type of pancreatic cancer.


This is absolutely incorrect: he had islet cell neuroendocrine cancer, which is a much less common form of pancreatic cancer. "Highly curable pancreatic cancer" is a totally ludicrous statement.

The prognosis of this type of tumor is highly variable, and he lived a pretty long time with it regardless.

He did not 'disregard "the medical elite" and went on an all-fruit diet'; he had the tumor removed[0] nearly 20 years ago, and only after years of the most advanced medical treatments in private did he finally die. Steve Jobs lived for seven years after the tumor was removed, meaning he outlived the five-year survival mark.

Sadly, if somebody had stage 4 pancreatic cancer in the year 2011, the only cure for the disorder that will follow their untimely passing would be to soberly arrange one's affairs and express their humanity to everyone they have ever cared about.

[0] - https://www.sfgate.com/news/article/Apple-s-Jobs-has-cancero...


“The 5-year survival rate for people with pancreatic NET that has not spread to other parts of the body from where it started is 93%.”

Jobs delayed aggressive treatment for close to a year. He finally had the Whipple surgery, but the cancer recurred, likely because it had already spread.

We'll never know what might have been, but suffice to say he only shortened his life.


The "highly curable" is relative to other types of cancer.

While Jobs did undergo surgery, he waited 9 months and didn't get surgery until after the cancer had spread. I don't see any credible medical claims that this did not have an impact on his survival.

> But Jobs refused surgery after diagnosis and for nine months after, favoring instead dietary treatments and other alternative methods...By the time Jobs finally opted for surgery, the cancer had spread. [0]

[0] https://www.forbes.com/sites/alicegwalton/2011/10/24/steve-j...


This is at the very least a reasonable description of the actual events.

But even now that we have acknowledged "highly curable" to be relative to other types of cancer, pancreatic cancer being among the most sinister and deadly, I take great issue with the notion of highly curable pancreatic cancer.

  By the time Jobs finally opted for surgery, the cancer had spread. He had an under-the-radar liver transplant and began putting a lot of energy into researching the most sophisticated experimental methods, making a complete about-face from how he began his treatment years before.
  According to the New York Times, Jobs was one of the few people in the world to have his genome sequenced. Collaborating researchers at several institutions sequenced his DNA in order to develop a treatment that would target his specifically mutated cell pathways. He went for an experimental treatment in Switzerland in 2009, which involves using a radioactive isotope to attack the faulty hormone-producing cells of the body.[0]
This story is so much more about anger, denial, bargaining and the grieving process than it is about an (admittedly and certainly) aloof and incredibly powerful figure who was defiant in the face of medical authority.

The father of a previous romantic partner of mine was diagnosed with pancreatic cancer and made startlingly similar choices against the urging of his entire family and everyone he knew. He was surely not an aloof billionaire who ate nothing but fruit; he was a very pragmatic union carpenter who took great pains to manage his health.

I know you did not make a claim like this, but I personally don't find it at all unthinkable for somebody to make this sort of choice when faced with this diagnosis.

[0] Same, https://www.forbes.com/sites/alicegwalton/2011/10/24/steve-j...


Fair enough. A (probably weighted average) prior on his pancreatic neuroendocrine tumor puts the 5 year survival rate at 54% instead of 9%. [1] It appears there are more pancreatic NET cases in the least favorable, "distant" SEER classification.

Nevertheless, there can be more factors at play than distrust of the medical community. Some people may want to live out the rest of their lives free from the harsh side effects of chemotherapy, for example. I wouldn't use his cancer treatment decisions to argue that tech elites make bad choices in the face of information, to bring us back to the broader discussion.

[1] https://www.cancer.org/cancer/pancreatic-neuroendocrine-tumo...


That's because the "marketplace of ideas" is being distorted by algorithms that are going to favor "provocative" ideas that drive clicks. Exciting lies defeat boring truths.


That tells you that the "marketplace of ideas" is a dumb idea, because we have already proved a million times that people are not the rational creatures that we were sold we were for a long time. We are smart apes, but apes nonetheless: emotionally driven by our limbic system, very tribal and status seeking. Prone to rationalize all these basic impulses in a million different ways, creating the illusion of rational individualism. The algorithms just show us our true nature; don't make the mistake of shooting the messenger.


There is a marketplace of ideas. That's not anything that is ever going to change. Concepts spread and are accepted or they don't.

The question is only how. Is it going to be by dictate? Is a church or government or institution going to enforce which ideas spread, or is it going to be free, spreading at the level of the individual?

There is no moral alternative to the free exchange of ideas. The only alternative is a restricted exchange... censorship and punishment.


> or is it going to be free, spreading at the level of the individual?

And what are the consequences of that?

When people talk about the marketplace of ideas they tend to think of good, beneficial, ideas winning out over foolishness, conspiracy and just plain inaccuracy.

This has been shown up as naive lately. And if we really are talking about the "marketplace of ideas" being raw, Darwinian, strongest wins, then are we prepared to deal with the fallout of the fact that loud, motivated, well funded bullshit merchants will continue to have a huge grip over the public dialogue?


I feel you are possibly too cynical about the ability of people to (eventually) figure things out.

Just because something is posted on the internet doesn't mean large numbers of people actually believe it.

There will always be "wrong" people. That will never change. But there is a degree of wisdom in crowds.

Proper education and "good" cultural values are extremely important and we haven't done so well instilling those lately.


Censorship has also turned out recently to be a bad idea. Neither solution is perfect, but since censorship has the capacity to destroy information we should choose the less dangerous path of “do nothing”. What is so dangerous lately that has you concerned?


Or that a few algorithms created by a few companies do not constitute anything remotely resembling a marketplace.


People can choose to get their information from places other than the large platforms. To me, the "marketplace" is the internet as a whole. If people choose to stay on the platform, is it not similar to choosing CNN and not ever looking at Fox? In the end, people have many options but choose to limit which options they engage with.


There’s a section in the EFF article on why not being on the large networks is a detriment.


The usual idea of a marketplace is preserving people's freedom to go to alternatives. Yes, sometimes that freedom has to be preserved via intervention to preserve choices (e.g. antitrust laws), but it's always weird to me to see the critique used as a reason for less freedom rather than more.


Also the shock "wrongthink" things we see on the internet might not be a representative sample of what the general public actually believes.


The marketplace of ideas works. It may not work in the timescale you want or with the tradeoffs you want while the truth settles out, but I think these claims that it doesn’t work are hasty.

Furthermore there isn’t a better alternative. Right now censorship from platform owners is abusively used to skew political discussions, using the pretense of factuality. I regularly see fact checkers make mistakes, make misleading claims, or inconsistently apply scrutiny. I see trusted organizations like old newspapers and health organizations (like the WHO) regularly make mistakes or sell their own speculation as incontrovertible truth. Trust cannot be given to a few lone entities. The marketplace of ideas doesn’t have this problem since it is decentralized.


[flagged]


How can you say that with regards to tobacco? It's all but illegal, completely banned in most public places, you can basically only use it in your home or outdoors alone.

"Not on the timescale you want" is the salient point.


If the system actually worked, tobacco would be banned outright, not almost banned.

Despite being only able to smoke outdoors I still see many young people in Spain taking up the habit.


What if people don't share your priorities and would rather smoke despite the health consequences? Where does this line of thinking end? Why should someone else get to tell us how to prioritize pleasurable activities vs health and longevity?

The argument about misleading advertising I understand, and if you'd said "advertising portraying smoking as safe would actually be banned" then I think there's a clearer argument (despite my being uncomfortable with the idea of "misinformation" being targeted, we have lots of reasonable precedents for statements you can make about products you sell). But legislating what people's health priorities should be is authoritarian and not a power government should ever have.


Why should tobacco be banned? That's just another war on drugs that didn't work for alcohol or pot, and isn't working for the current illicit drugs. People should be allowed to smoke what they want, as long as it isn't a health issue for others. If young people in Spain want to smoke outside or in their homes, so what?


Nobody starts smoking because they like it. They get roped into it by others.

Pretty much everyone I know regrets they ever started. That's the difference. Many other drugs are actually nice to use.

I don't want to ban it for those old people that have smoked all their lives. But that young people still take it up is worrying.


But timescales actually matter. Without any expectation of a specific time scale, you can claim that anything is actually working perfectly and just hasn’t worked yet.


That tobacco is "all but illegal" now (in the US - the same exact companies continue their murderous strategies in developing countries) is a bit late for all the people who lost family members in slow and horrifying agony.

Their execs kept the money they made.

Their model was so "successful" that it has been studied, copied, and extended by the other industries I named - with global and irreversible consequences.

As was put so succinctly above: timescales matter. Oceans don't give a fuck what I want, they won't de-acidify themselves no matter how much I cry about the marketplace of ideas.


If the algorithm distorts conversation, then how can you be sure it more accurately reflects human nature?

How does something that isn't a marketplace of ideas prove it's dumb? Or should this question be punished by your algorithm?


The main agenda the algorithm has is selecting stuff human nature has a high propensity to click on...


Kind of like drug cartels having the agenda of producing drugs that human nature has a high propensity to be drawn to?

In the face of hyperstimuli, we can't talk about mere human propensities as a justification, else we would have already turned ourselves into junkies many times over.

How many years did it take for heroin to go from a cough medicine to a schedule I drug? Amphetamines, cigarettes, etc.

There is a reason you can't simply put everything on the marketplace, and that is OK. That doesn't mean the idea of a marketplace is broken.


Alternatively, it simply makes it easier to click on the presented content than to purposely seek out other content. Front pages and headlines dominate media for a reason.


That's an assumption which critics of these algorithms would disagree with.


That may well be a big factor, and that's something that needs to be looked at more closely. Lots of people describe being drawn down the rabbithole by algorithmically driven suggestions which nudge them ever more extreme.


I have thought this for a while. Algorithms are something entirely new to human discourse.

Apparently I watched some alt-right-adjacent podcast or something, so for weeks youtube kept recommending seriously racist, fascist, white nationalist content. The algorithm was actively trying to lead me down the rabbit hole. Obviously it's learned that doing so increases engagement.


The lab-leak hypothesis is provocative (and probably true), but got smothered anyway. It was just inconsistent with the narrative.


You believe in the narrative?


There was quite a public display last election about news orgs agreeing to not cover Biden in a negative light so as to not hurt his chances. What would you call this?


I don’t know, newspapers support one party or another. The narrative with emphasis makes you sound like a nutjob.


> I don’t know, newspapers support one party or another.

Yea, that’s one, extremely light-footed, way to explain it. It doesn’t explain why all left-biased news sounds nearly identical, though, nor does it cover the dangers of the groupthink mentality we see manifesting now.

> The narrative with emphasis makes you sound like a nutjob.

That's a pretty petty thing to take issue with, wouldn't you say? But ok, dems now say using italics makes you a nutjob, let's ban italics now.


Same could be said about right biased news, and this has always been like that (both sides of spectrum) since the invention of newspapers basically.

I'm not American, hell I can't even remember which of democrats/republicans are more or less conservative - or which animal logo is which.

What I'm saying is, that using the narrative makes you sound like a conspiracy nutjob that insinuates that there is some conspiracy going on.


> Same could be said about right biased news, and this has always been like that (both sides of spectrum) since the invention of newspapers basically.

How come dems always resort to whataboutism?

We’re not talking about biased news. We’re talking about concerted efforts by multiple groups of people to create a single unified voice for their party.

Also do you get that you can not be a dem and also not be a republican?

> What I'm saying is, that using the narrative makes you sound like a conspiracy nutjob that insinuates that there is some conspiracy going on.

Oh it’s no conspiracy, it’s a unified party fighting what they view as an enemy.

The nutjob thing is just stupid already. In fact you’re starting to sound like one for having such and issue with it.


This is clearly not whataboutism. I’m not gonna debate why; I leave that as either an exercise for you, or take it as a trolling attempt.

> you can not be a dem and also not be a republican

False dilemma.

> Oh it’s no conspiracy, it’s a unified party fighting what they view as an enemy. The nutjob thing is just stupid already. In fact you’re starting to sound like one for having such and issue with it.

Sure sure


> This is clearly not whataboutism. I’m not gonna debate why, I leave that as either an exercise for you, or tale it as a trolling attempt.

https://en.wikipedia.org/wiki/Whataboutism

I leave that as an exercise to you. This quite literally fits the definition.

Agree with the rest or not, let's see what happens in 2022! Dems keep digging a hole for themselves acting this way.

Also, speaking of trolling, remember you took issue with emphasis and still haven't disproven that the narrative exists. So this entire post chain is a troll attempt by yourself.


It didn’t get smothered though. Since the early days of the pandemic I haven’t seen any discussion of the pandemic of any appreciable length that doesn’t contain mentions of that hypothesis.


That's a case of the "marketplace of ideas" routing around the attempted suppression by the large media organizations and government. I don't think you would have read any mention of it (besides condemnation) from any of them, until the last few months.

Geez, I think that paragraph is completely true, but it sure sounds like conspiratorial nonsense. The state we're in...


It seems to me that it has been a popular meme since the start of the pandemic.


It seems unfair to deprecate algorithms which "favor 'provocative' ideas that drive clicks" as a `distortion` of the marketplace of ideas.

It seems more that by driving clicks, the algorithms are `facilitating` the marketplace. People are signaling the ideas they prefer, idea suppliers are producing more of such ideas, and the people get these ideas at ever-lower prices.

How is that in any way a distortion of an ideal marketplace?


There is no ideal marketplace of ideas.

It takes years to fully develop an understanding of any particular ideology. How are you supposed to judge which one is better at interpreting the world in such a short-term system?

There is simply too much friction in the ideological marketplace for it to approximate even loosely an ideal market.


It's an ideal "twitch" marketplace, not an ideal "deep thought" marketplace. "The marketplace of ideas" wasn't usually considered to be about gossip, rumor, or entertainment. Those are... something, but not "the marketplace of ideas".


Provocative ideas are not the same as good ideas. These algorithms artificially promote provocative ideas because that drives revenue. That's a market distortion, just like subsidies distort economic markets.


I actually think algorithms are more likely to promote things that either reinforce your existing beliefs or incite outrage.


That is an idea extremely unlikely to be contested, given what we've known for a few years now.


I don't think there is a marketplace of ideas that can be separated from the algorithms used to promote them.

Every ordering of content is going to favour something, even chronological.

Somebody will find a way to abuse any sorting mechanism you use


Give the user the driving wheel of the recommendation or open up the algorithms. It's not that complicated.


Speaking of, I wish hn would hide upvote counts. If I wrote something provocative and stinging here against social media, it would be karma city, and it feels good to see your number go up. That's not a temptation I like.


That is no doubt true (about how the algorithms work). At the same time what we see is that there is a market, demand, for misinformation. And some people are making money from that.

Why is there so much misinformation about Covid vaccines and vaccines in general? Because people want to believe in such misinformation and because that allows websites to make money from their readership. Therefore the peddling of misinformation continues.

It is not so different from what publications like the National Enquirer and Weekly World News have been doing for a long time. But instead of paying for the National Enquirer you can now get such misinformation for free on the web, because you are paying for it by spending your time seeing the advertisements. Information wants to be "free", whatever that means, but people want misinformation, to give them the feeling they are right about their prejudices.

SEE: https://fortune.com/2021/05/14/disinformation-media-vaccine-...


Because there are sociopolitical actors who can extract direct benefit from OTHER people believing misinformation to their detriment. You are omitting this motivation, and it's a significant factor, worth spending money and energy on if your motivation is to harm a population that you see as your enemy.

This assumes a global marketplace of information propagation. That's the new factor here: the actual motivation isn't all that new.


I wonder how prevalent the will to "harm a population" is. I think it's more about benefitting oneself financially than trying to harm others. Politically speaking yes politicians try to harm (the popularity of) their opponents. But politicians are a very small slice of the population.


> Weekly World News

This has always been obvious satire.


Yeah, well the Chupacabra still keeps me up at night.


> factually incorrect, scientifically illiterate, conspiratorial, harmful crank ideas gain legs in the online world we've created

I don't think this is unique to the online world. It's just how humans are. We are drawn to sensational, scandalous stories for some combination of entertainment and self-righteous outrage/virtue signaling to the others in our tribe.


> We are drawn to sensational, scandalous stories for some combination of entertainment and self-righteous outrage/virtue signaling to the others in our tribe.

While I agree, I think there's more to it than that, I think it's more serious than that, and I think the message amplification capabilities afforded to profit-driven (or ideologically driven) bad actors is something we've not really seen before. Certainly not at this scale.

So while it is human nature, that's not to say the outcomes are desirable, nor that the marketplace of ideas concept is consistent with reality. In fact I think it might be the point, much like the perfect economic market, the concept of the well-functioning marketplace of ideas cannot exist, because of the humans that make it up.


"major platforms’ amplification features have also caused or contributed to real damage in the world. At a societal level, they have spread misleading political material, to the detriment of democratic governance"

https://knightcolumbia.org/content/amplification-and-its-dis...


The Catholic Church felt the same way about Gutenberg's press.


Gutenberg's first mass-produced book was a Latin Vulgate bible. The 40 remaining copies are among the most valuable books in the world. Gutenberg also printed indulgences for the church. It's fair to say the Catholic Church absolutely loved Gutenberg. https://www.newadvent.org/cathen/07090a.htm


>It's fair to say the Catholic Church absolutely loved Gutenberg.

I wasn't talking about the man, I was talking about the machine. The Catholic Church certainly didn't love it anymore once the Reformation happened. See, the Catholic Church could no longer control information when lay people could read the Bible and Martin Luther's texts. Gutenberg's press completely transformed the church's control in Europe, and the effects last to this day.

https://speccoll.library.arizona.edu/online-exhibits/exhibit...

This has parallels today. The government can no longer control information via traditional media channels because the internet, in its current form, exists. They want that control back. I believe this is an attempt to regain control. The government would never shut down big tech companies, they are a great source of information for data collection.


I knew you were going to bring up the Reformation. The printing press, by that point, was something anyone with sufficient resources could obtain. The Catholic Church was certainly not lacking for resources. Martin Luther and his allies didn't have access to non-traditional media channels. Everyone was using printing presses, the Catholic Church just wanted to have the last word on who was allowed to use printing presses.

Interestingly, in 1644 John Milton wrote an impassioned philosophical defense of the principle of a right to freedom of speech and expression. He wrote it in response to the requirement of the Protestant government that all authors be licensed and approved by the state. In that defense, he wrote, "Yet if all cannot be of one mind—as who looks they should be?—this doubtless is more wholesome, more prudent, and more Christian, that many be tolerated, rather than all compelled. I mean not tolerated popery, and open superstition, which, as it extirpates all religions and civil supremacies, so itself should be extirpate"

https://www.gutenberg.org/files/608/608-h/608-h.htm


>Catholic Church just wanted to have the last word on who was allowed to use printing presses.

Which is the way government is treating big tech, government wants the last word on "truth." Sounds like you are agreeing with my analogy.


I agree that the outcomes are not desirable. Many people over history have recognized this, and that is why we have religions that recognize our inherent flawed or "sinful" nature in this regard and give us a framework of rules for how to live. Look at the "Seven Deadly Sins" just as one example, it's pretty much what the social media algorithms select for when they promote content.


It is not unique to the online world, but the online world does change the dynamics around these things significantly, because the social immune system that we have in the real world completely fails on the internet.

In a small, real world friend group "that guy" who talks about ancient aliens, flat earth and Pleiadians or whatever gets shut down real quick and suffers consequences within his social circle.

In the online world, "those guys" can find each other and egg each other on into more bizarre world views. This isn't a theory, you can watch this happen in real time with the qanon "movement".

People who (rightfully) feel estranged by mainstream media or left behind by politics pick up some weird idea, find communities who agree with them and end up holding on to those views longer than they would without the support. In some cases, this leads them to slowly becoming more estranged from real world contacts. Family and friends distance themselves because they don't want to hear any more about how Earth being a globe is somehow a big conspiracy and this little supportive online community eventually remains the only safe space to talk openly and they become more invested in it.

The problem is these communities harshly punish everyone who disagrees with the mob. Increasingly outlandish ideas are pushed into the conversation, and your only options are to agree or to be turned away, which becomes increasingly painful the more other contacts withdraw from you. It's a death spiral that is scary effective.

Eventually the only people that you can relate with is a group that also talks about how every bad thing that happens is orchestrated, democrats are all traitors that need to be put in front of a firing squad and baking soda treats cancer.

We need to come up with something to fight this deterioration of the social fabric, but I agree censorship is not sufficient or maybe even helpful.


> Better ideas, better models of truth and reality don't win out. People have not shown themselves interested in intellectual debate or argument.

This is plausible, so people believe it. Fortunately, it is false. Compare the state of the world regarding germ theory, human rights, smoking, racism, education, the belief in violence to solve problems, or the delusion that is theism to the world 100 years ago.

It used to be way, way worse. The better ideas are winning.

Things are improving tremendously. It just takes time.


But are those ideas winning because people are rationally choosing to believe in them in the marketplace of ideas, or just because we grew up with them and the people with contrary beliefs are dying out?

There is a saying that "science advances one funeral at a time" because even professors etc., one of the most rational groups of people on the planet I would think, have biases against new theories despite evidence supporting them, instead sticking to what they already know even after it has been disproven.


Those are all pre-internet. Not sure they are good examples of how the marketplace works in social media.


"Often at the root of these is a profit motive."

Big Tech is DOA if the web is not open for commercial use, i.e., advertising. But the web is definitely not DOA if all advertising ceased. Look at the enormous growth of the internet, the vast number of users with internet subscriptions today, billions of them using the network on a daily basis for a variety of non-commercial uses, hundreds of millions uploading content for others to consume. (Big Tech middlemen pervert this recreational usage for their own commercial uses.)

The internet was not created for the purposes of advertising. (There was none in the beginning.) That is only one use. Look what happens when we allow ads without any rules. Yikes.

Even if advertising were regulated, the web could still be used for commerce, e.g., processing commercial transactions.


> factually incorrect, scientifically illiterate, conspiratorial, harmful crank ideas

There are indeed some factual mistakes floating around, mostly around covid-is-just-a-flu and the-earth-is-flat discussions.

But usually shit hits the fan in social issues of poverty, homelessness, abortion rights and such.

Alas, there's no hard science in most of these issues. There is a bunch of plausible-looking speculations, philosophical theses and general thoughts. Pretty much like 17th-century physics, 18th-century biology or 19th-century medicine.

As for the ideas being harmful... What do we do with potentially harmful, yet factually correct statements, and does the same logic apply to beneficial, but factually wrong ones?


I don't know, I genuinely don't have a good solution - I'm pretty sure "ban it" is not one.

But I think we need to recognise, when we try to address this stuff, that our current picture of speech online is suffering not just from deplatforming, but also domination of some narratives by high volumes of motivated bullshit.

> usually shit hits the fan in social issues of poverty, homelessness, abortion rights and such.

flat earth is .... well I wouldn't put it with the most harmful ideas. It's clearly very silly, and not a great thing to propagate, but so far flat earthers haven't taken their beliefs out on the road, so to speak. Antivax, now that's harmful. There are certainly all sorts of grey areas and I wouldn't want to suppress discussion of safety, of risks from emergency approval or whatever, but when you get to "The vaccine makes you infertile" and "The vaccine will kill everyone that gets it" and "The vaccine contains 5G chips that they are going to use to track you", and such views actually start to impact the uptake...


> and such views actually start to impact the uptake

I’m not sure that’s how the chain of causality works. We have 80s-style public health communication that isn’t effective today. People are looking for a reason to reject it. If not the 5G chip stuff, it’d be something else.


I think this just reinforces that the marketplace of ideas is a bust... personally. Because again, people aren't looking to evaluate rationally.


People are turned off by appeal-to-authority logical fallacies.

Trying to make them the only acceptable source of information is never going to sell.


I'm afraid that people are also turned off by facts that go against their preconceived notions, and people are predisposed to accept, with little question, authorities with whom they already identify.

In fact I'd go as far as saying your argument there is a logical fallacy, as I'm making no claim people should accept appeals to authority.


I don't think antivax conspiracy makes a lot of impact.

Current vaccination rate in the US is 55%.

This is comparable with the number of people who, for example, do a dental check-up at least once a year, which is 60%, although there's no dentists conspiracy theory, at least not such a popular one.

P.S. those downvoting, please feel free to, but I would highly appreciate a couple of lines on what's wrong with the message? Pure curiosity.


The difference is dentists in the US cost money (unless you have a job with dental insurance which many low-end jobs do not). People who don't go to dentists regularly probably can't afford it. The Covid-19 shot is free (at least in the US).


Agreed. I doubt that half the population thinks that the COVID vaccine is a conspiracy to kill them, render them infertile, or depopulate the earth.

I'm not vaccinated, and I don't believe those things. I do see the dentist every 6 months though.


Tribal politics is probably a better predictor than fringe theories.


I have a friend who's gone off the deep end in the past year (like so many others).

He freely admits that he doesn't believe the vaccine is bad, and might even be good. His sole reason for not taking it is, and I quote, "because they want me to." In other words, simple spite.

We're not talking about a teenager here, but a successful man in his 60's who is probably the single most well-read person I know.


You see no value in the instinct to resist conformity? All major social and scientific progress has been made by someone with that kind of mindset.

You could argue that it's misplaced in this case, but that's already a more charitable interpretation of anti-conformists that will serve you better than just saying they've "gone off the deep end".


For good reason we don’t accept people yelling “fire” in a movie theater. Anti-vax misinformation during a pandemic is definitely in the same vein as yelling fire in a theater.

There are a few solutions. Don’t know that any of these are possible, but sometimes solutions only get found in a listing of possible solutions.

Any sourcing of the internet as a source of truth could be called into question on its face.

Or massive platforms could find a way to be less massive. Ie “we’re just not going to host governmental agencies or politicians.” Also bracketing the scope of posts and information on social media.

Or who knows.

Edit-one observation I’ve had is the internet is best when tied with another source of truth. Ie you can generally try recipes from a website and tell whether it makes palatable food by just making and tasting it.


>For good reason we don’t accept people yelling “fire” in a movie theater.

This meme persists, because it sounds nice, but the phrase is from a Supreme Court case where they ruled you can't distribute anti-WWI-draft flyers. Not exactly the precedent to be invoked, here. It was overturned (in part) to define that criminal speech under the first amendment is only speech that is determined to incite imminent lawless activity.

>Any sourcing of the internet as a source of truth could be called into question on face.

Which is why banning "misinformation" is a terrible idea.


In a well-run theater yelling fire should lead to a prompt and orderly evacuation.

It's called fire drill.


> Better ideas, better models of truth and reality don't win out. People have not shown themselves interested in intellectual debate or argument. We do not see the 'best' ideas rise to the top.

This has been well known for a couple thousand years at least. Aristotle comprehensively covered it in his Rhetoric[1]. There have always been people who can only be persuaded by rhetoric and not by dialectic. If anything, our contemporary educational system is designed to form persons to be incapable of dialectic and susceptible to rhetoric. And so the current state of affairs is unsurprising.

It's strange to think that rhetoric used to be a standard secondary level subject in the West. I wonder what it would be like to live in a society where virtually every moderately educated adult were well versed in persuasion tricks.

[1] https://plato.stanford.edu/entries/aristotle-rhetoric/


Are we sure that most, or even many adults are well-versed in things they were taught in school? My experience suggests otherwise. I would honestly struggle with trigonometry or biology today, both subjects I excelled at in school. Funny enough, in my facebook feed today there are numerous people failing to answer "50+50-25*0+2+2" correctly.
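(The trap in that Facebook puzzle is operator precedence: multiplication binds tighter than addition and subtraction, so the 25*0 term drops out and the remaining terms sum left to right to 104. A quick sanity check in Python, which follows the same precedence rules:

```python
# Multiplication is evaluated before addition/subtraction,
# so 25 * 0 vanishes and the rest sums left to right:
# 50 + 50 - 0 + 2 + 2 = 104
result = 50 + 50 - 25 * 0 + 2 + 2
print(result)  # 104
```

The wrong answers people give usually come from evaluating strictly left to right, which yields (50 + 50 - 25) * 0 + 2 + 2 = 4.)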


It's less a "Marketplace of Ideas" and more a "church or private establishment" on the network. Even HN isn't spot on; sometimes bad ideas do rise up when they appeal to people's emotions more than sensibilities.


Social media (and the internet as a whole) are shaped by and promote the values of those who create it. They allow certain interactions and not others. (to what extent are the creators the leaders of the social media companies, vs. the users?)

https://firstmonday.org/ojs/index.php/fm/article/download/51...


> We do not see the 'best' ideas rise to the top.

It's the same with the economy. It's not the best products that win, but the ones with the biggest advertising budget.

We should try to dampen the effect of these counterproductive forces.


Glad to see your very good idea rise to the top.

And this is mostly because HN, partly though the excellent work of dang, does, well.... kinda the same thing. HN is censored. And I approve.... that's why I'm here rather than somewhere like 4chan.


Ha, perhaps the success of the comment disproves its content!


Missing from this discussion is the understanding that speech is an inalienable right.

It does not need to be justified on consequentialist grounds, nor on the basis of the more pernicious metric of harm reduction.

Pointing a gun at someone to silence her is a form of censorship, and so is quietly erasing her from the most prevalent communications channels (deplatforming).

Figure out a way to achieve harm reduction without grotesque violations of natural rights. Until then, there will always be people who prefer dangerous speech to the safety of slavery.


The internet has been a beautiful experiment to find out what crank ideology aligns best with the human default instinctive nature.


It's not good to both observe that crank ideas rise to the top due to the profit motive, and to think the solution could be that the people who are best at profit should get to censor speech.

Maybe we should attack sleazy commerce rather than speech?

The reason people go in for crank ideas is because they recognize that the people who sell them things are constantly manipulating them. They work so they don't have time to investigate everything themselves, so as a proxy many tend to believe marginal people who they don't know at all due to the signal that their speech is being suppressed by known liars who are never called out because of their power.

Meanwhile, the upper middle-class people who provide the infrastructure and strategy for the biggest frauds are smug in the belief that their betters are being honest about what is true and false, because their income depends on it.

The problem with people is that they are unmoored, with absolutely no sources of information that are not trying to squeeze cash out of them. They come up with sketchy heuristics to give them some semblance of stability between shifts at work and climbing pointless complications in their lives created by rent-seekers.

The problem with the comfortable upper middle-class is that they are too moored, too sure they're at the end of history. Too sure that they know what is true between stitching together half-remembered NYT and WaPo headlines with their discussions with each other at restaurants and dinner parties. Too sure that truth can and should be dictated by people who have a better degree than they do. As if maintaining that comfort is not an interest, as if bias towards themselves as "the middle" is actually the definition of being unbiased.


It's about the time constants of communication that cause this. Given time, the truth wins out. But when you accelerate communication beyond human capacity to process it, you get what we have today.


> Better ideas, better models of truth and reality don't win out.

These are just subjective judgements. All you're saying is that people you don't share the same value judgements with won out.


How do you explain the massive list of advancements humanity made during the Enlightenment period with this attitude? Seems like you're advocating another dark age.


> I can see that factually incorrect, scientifically illiterate, conspiratorial, harmful crank ideas gain legs in the online world we've created.

The fact that you phrase your characterization of incorrect ideas like this suggests with pretty high certainty that you don’t exactly have a great birds’-eye view of the epistemic landscape yourself. People who characterize the “opposition” this way mostly get their ideas from NPR-tier Pravda publications, and aren’t exactly better suited for picking out “better models of truth and reality”.


I don’t characterise “the opposition” this way. I characterise some of the more prominent ‘fringe’ stuff as this.

I don’t really have an ‘us’ for there to be a singular ‘them’ I’m talking about, I’m not American and I don’t identify with any particular political party.


The fact that you equivocate NPR and Pravda means you’ve huffed plenty of those fumes yourself.


> Voting Republican is to enact fascism and tear down the tenants of democracy in the US

You seem like a skilled political analyst, so I defer to your expertise. One correction - it’s spelled “tenets”.


like every marketplace, what matters is the entrenched interests in the marketplace and the amount of space needed for a new market.

when the "too big to fail" market is filled with entrenched interests and bloat themselves to take up most real estate, then ideas only compete if the existing ideas accommodate them.

this is where we derive the "Uber for X" marketing speak. as such, the market


I think you could make a basic content richness argument. There are interesting nuanced left, right, libertarian, socialist, etc. ideas but none of these are the ones discussed. The stuff that gets popular is meme level trash. Social media selects for punchy sound bites as much or more than the old media it is replacing.


“But the peculiar evil of silencing the expression of an opinion is, that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth; if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error.” -- John Stuart Mill couldn't disagree more.


You're using the quote of someone who lived and died before the invention of the telephone, and also ignoring that he also advocated for a 'harm principle' (https://en.m.wikipedia.org/wiki/Harm_principle) which states "...that for such actions as are prejudicial to the interests of others, the individual is accountable, and may be subjected either to social or to legal punishments, if society is of opinion that the one or the other is requisite for its protection."


So you think we should completely ignore anyone who said anything of substance before any arbitrary technology was invented, or if they held any other opinion that you disagree with? We can safely cast these people's opinions out as invalid? Is that not exactly what you are implying? Do you not see how absurd those implicit arguments are?

That's not to mention the fact that you are ignoring the amount of influence that man had on our political and economic systems which benefit your life in ways you probably haven't considered...


It is this attitude which appears to be based on an idealistic assumption that people care about exchanging error for truth.

This is what I am arguing is not really the case, as demonstrated amply by the world around us.


Take what you support and imagine it used against you in the worst possible ways. Do you still support it? If yes, then it is worth supporting. If it's not, then maybe you're only supporting it because of who it currently helps or harms. This exercise also works with things that you don't support.


This applies to policing of all kinds? Isn't this literally an "abolish the police" argument? After all, imagine the police used against you in the worst possible way ..


Surely one can conceive of effective ways to limit the damage the police can do to people short of abolishing the police. For example, disarm the police and they can’t shoot you.


Well, yes, that's kind of my point; strawmanning all sorts of scenarios distracts from taking a look at what's actually happening and creating meaningful checks and balances.


You still are paying the same under your plan.

You abolish them and then move that money/power into some other entity like social services until the power / money corrupt them.


As long as it's legal for them to kill you because they feel like it, not having a gun doesn't change that.

Their knees are just as deadly, but I don't think taking away officers' knees is a good next step after their guns.


Or just have public audits of police forces and provide accountability and transparency into cases as necessary.

In addition good policing should be rewarded with significant bonuses.

Incentivizing good behavior will always trump criticizing "bad" (often never defined) behavior.


I personally would rather see disciplinary action akin to what's done in the military. Not that that isn't without its issues, but the machinery that makes it work is larger and more independent from the people being disciplined than what you get at the state and local law enforcement level.


Right now you have a damned-if-you-do, damned-if-you-don't situation where police get zero credit for saving thousands if not more lives, yet the one time they make a mistake they are immediately put on the chopping block. Recognizing what policing should look like and idealizing that would go a long way.


No, since you are forgetting to take into account the probability of an outcome/risk analysis.

The probability of the local police being used against your city is so low, that the benefits are deemed to outweigh the risks. On the other hand, imagine software that lets you spy on all your employees' personal opinions, being sold as a way to "help strengthen culture-fit". That is something where the negatives may vastly outweigh the positives.


Yes, That's why most people are against the use of military equipment by police even if they're not against defunding them.


First off: no, that is not the abolish the police argument. The argument for police abolition is as part of a massive restructuring of how we think about crime, not how we think about police. It's about taking our focus away from police themselves (the antithesis of 'imagine the police doing horrible things to you') and onto the factors that make people victims of the police: systemic racism, drug addictions, mental illnesses, and the exploitative nature of capitalism at its worse. When we redirect our attention to solving the root issues, focusing on policing is redundant and harmful.

Second off, you're totally right. Abolish the police.


Many of the people who say they want to "abolish the police" actually just want to be the police, but under a different name and under their rules.


This makes the false assumption that authoritarians won't abuse the system simply because non-authoritarians didn't. Authoritarians gonna make up their own authority regardless.


> Authoritarians gonna make up their own authority regardless.

I think they’ll try to, but not always succeed. They will definitely use and abuse any existing powers.

So I think it’s logically unsound, and quite sad to me, to argue that it’s pointless to resist allowing powers that can be exploited because authoritarians will do it anyway.


> They will definitely use and abuse any existing powers.

They'll also abuse new powers, particularly known-but-unused powers.

My point is that it's better to make the best rules you can for now, rather than limit yourself simply because a bad guy will someday make it worse. They'll make it worse regardless.


I really appreciate your points on this. I've come to believe it's impossible to create an incorruptible system. People have choice and can change anything. The harder you fight it, the more subtly it happens.

The best system is one that continually generates a numerous and capable enough cohort unified in a common enough purpose to shepherd it long enough to create yet another generation of shepherds.


> They'll also abuse new powers, particularly known-but-unused powers.

Sounds recently familiar. I watched my state, city, ward, public health office and everyone in between or associated to “discover” new powers as a result of Covid.

My state did not do comparatively well over 2020. All we had to do was what was logical, and we could have made these decisions early.


They discovered old powers, really.

The Spanish flu was a thing way back when


This is very insightful. I've seen too many employees in big tech continue to shill their products and services because it helps their CEO or it helps their immediate team -- without thinking about who is being hurt (e.g. end users consuming snake oil) or how the product can be weaponized against the people it is supposed to be helping. This is as true for small startups as it is for companies with millions of end users.


This approach only works if you expect other people to do the same. I don't want to be thrown in prison for my beliefs. I'm not certain that "hey, we let you march in public" is going to stop fascists from throwing me in prison or outright killing me.


That can only happen if you gave government the ability to throw you in prison for that, and then somehow a tiny minority got into government and abused the power you gave them.


> That can only happen if you gave government the ability to throw you in prison for that

Why? The universe won't stop anybody from doing that just because laws exist. Authoritarians will not be halted by existing laws or norms. That's what I'm worried about. And that's why things like preventing fascists from organizing in public has merit.


> Authoritarians will not be halted by existing laws or norms

I think this is exactly what they will be halted by. 1930s Germany was full of propaganda against Jewish people and other countries. Why bother doing that if authoritarians can just snap their fingers and force people to do things?


The Liberal state has a monopoly on violence: whatever their violence is, like throwing someone into prison, is by nature justified in the eyes of the state.


I support preventing Roko's Basilisk.


That's a great approach. I like the concise way of stating it. I'll apply it to a couple things that are popular topics right now.

---

Automated fact checking:

The negative consequences are pretty obvious. It's easy to cheer when the target is neo-Nazis who want to overthrow the government. Well what if the shoe were on the other foot, and Republicans had been able to compel social media companies to flag posts disputing Trump's claims of election fraud as misinformation? It's not exactly censorship, but it's a ridiculously powerful lever for manipulating public opinion.

On a more mundane level, clamping down on non-mainstream opinions could cause a lot of low-level chronic harm. For example, it's not hard to imagine social media "fact checking" disrupting discussions on fitness and/or nutritional science to promote the food pyramid and the importance of a low-fat diet for heart health, or to shut down conversations about medical uses of cannabis because the DEA still has it listed as schedule 1.

Maybe there's a reasonable middle ground, but it's dicey either way.

---

Killing or reforming the filibuster:

The obvious negative consequence (from a center-right to left-wing perspective) is that Republicans will regain a trifecta of power and find themselves with carte blanche to pass all sorts of wildly unpopular laws eviscerating civil liberties.

Even so, I say do it. The filibuster is massively advantageous to Republicans because it gives them the ability to complain about problems while offering few or no solutions. I say we call their bluff, and risk giving them the opportunity to pass their agenda.

Either they'd still do nothing (in which case they'd lose a lot of single-issue voters), or they would do the things that they claim to want to do (in which case they would lose the next election in a landslide and never hold power again).

As-is, they're stuck in between a rock and a hard place trying to appeal to:

* Pro-life Christians

* Gun owners

* Right-leaning libertarians

* "Selfish"/anti-tax rich people

* The alt-right / neo-Nazi / Q cultist crowd

* Populists (who may not necessarily be conservative, as evidenced by the overlap in support for Trump and Bernie)

* People with conservative social/cultural values

* Typical center-right conservatives (to the extent that they still vote Republican consistently, or at all)

That's just what I can think of off the cuff, but even that is a pretty diverse coalition. All they really have in common is that they oppose (or, in some cases, believe they oppose) various parts of the Democratic agenda (both real and imagined). Their continued unity depends on the GOP remaining a superposition of all the different values they each independently project onto it. The second the GOP actually gets a chance to pass a major law along partisan lines, whether they choose to do it or not, the superposition collapses and shoes will start to drop.

What do you think will happen if they take power and proceed to ban all abortions, remove every form of gun control, repeal the Affordable Care Act, take federal action against vaccine development/distribution during a pandemic, pass or shoot down a relief bill during a pandemic, escalate or deescalate the Drug War, dramatically increase or decrease environmental regulations while a climate disaster affects a red state, and/or dramatically alter regulations on the Internet / social media / E2EE / cryptocurrency? What if, with a legislative majority and in the absence of the filibuster, they don't do any of those things? I suggest that any action or lack thereof would be a huge blow to their support in some of those groups; they would have to pick their poison.

The wildcard here is if they were to use such a trifecta combined with their current dominance of the Supreme Court to enact anti-democratic reforms to prevent any further transfer of power. However, seeing as this is already the direction we're heading in, I would say that it's vitally important to override the filibuster and pass voting rights legislation now so that we have a stronger chance at remaining a democracy, rather than accept the massive gamble of doing nothing.

The greatest threat facing humanity today isn't climate change. It's the current Republican Party, and the prospect of world's most powerful military and nuclear arsenal ending up in the hands of a hypothetical future theofascist America.


I think you are waging political warfare on HN, and violating the guidelines of "Please don't use Hacker News for political or ideological battle. It tramples curiosity."

It is important to never demonize your opponents too much, if you are going to try and remain civil with them.

>* Pro-life Christians

>* Gun owners

>* Right-leaning libertarians

>* "Selfish"/anti-tax rich people

>* The alt-right / neo-Nazi / Q cultist crowd

>* Populists (who may not necessarily be conservative, as evidenced by the overlap in support for Trump and Bernie)

>* People with conservative social/cultural values

>* Typical center-right conservatives (to the extent that they still vote Republican consistently, or at all)

This smacks of making a list..

> The greatest threat facing humanity today isn't climate change. It's the current Republican Party, and the prospect of world's most powerful military and nuclear arsenal ending up in the hands of a hypothetical future theofascist America.

Please reconsider how you post in the comments section


Please don't use Hacker News for political or ideological battle. It tramples curiosity.

I didn't attack any particular ideology. I politely described my opinion on a specific organization that attempted to overthrow my country's government, first via a soft coup and then by force, as it related to the current discussion.

This smacks of making a list..

And? Are you suggesting that HN is anti-lists?


> Are you suggesting that HN is anti-lists?

This gave me a seriously hearty laugh. They're just in bed with "Big List"!


> The greatest threat facing humanity today isn't climate change. It's the current Republican Party

> I didn't attack any particular ideology.

Well, sorry, my fault, it just seemed that when you aligned anyone right of center with neo-Nazis that some people probably took it that way.


when you aligned anyone right of center with neo-Nazis

Again, I didn't say that.

Furthermore, while it's entirely beside the point, I personally do hold many political positions that might be considered right of center in America; for example, I'm in favor of liberal gun rights and a restriction on late-term abortions.

When I use the term "neo-Nazi", I mean it in a literal sense, not as an insult directed at conservatives. In fact, I went out of my way to distinguish between the center-right and neo-Nazis. There's a big difference between advocating for a balanced budget and joining an armed white nationalist militia to fight a satanic pedophile cult. Pretending otherwise is an obvious bad faith argument.

If anything, by suggesting that these are the same people, you are the one equating conservatives with neo-Nazis. Call me far-left, far-right, or whatever else you want; I just don't see someone like Bush/McCain/Romney storming the Capitol with a Confederate flag while calling for the assassination of acting government officials.


Your example that Republicans are the big tent party doesn’t make a lot of sense to me. All those groups venn together a lot. Tell me how well LGBT, Muslim, Jew, Union blue collar, Black, Hispanic, costal elite, youth, and big tech get along if you lock them in a room together if all you have is “at least we aren’t Republicans!”.

>Republicans will regain a trifecta of power and find themselves with carte blanche to pass all sorts of horrible laws eviscerating civil liberties.

>Even so, I say [kill the filibuster].

Ok, seems like you know it’s short sighted but want to “win” just to do so, even if temporary and potentially catastrophic.


Your example that Republicans are the big tent party doesn’t make a lot of sense to me.

It shouldn't, because that isn't what I said.

Ok, seems like you know it’s short sighted but want to “win” just to do so, even if temporary and potentially catastrophic.

I also didn't say that.


I thought this would finally help many rabid anti-Trump people realize that giving so much power to the gov was a bad idea, but since the major media outlets and tech companies seemed to be part of the "resistance", this association of "limiting power" for the WCS never actually transpired.

I don't expect people to be persuaded by your argument. They'll just demand an increase of power to further the ideologies they deem objective and righteous truth.


...which is exactly how I concluded that the censorship we see online is basically necessary. The worst possible ways that online speech can be used against me include being murdered by a lone-wolf terrorist who received instruction online. The worst that can happen to me from censorship being imposed is that I will not be allowed to say certain things online or challenge those in power, or possibly that I will be excluded from these social networks entirely because the people in power do not like what I said -- which is certainly bad, but not quite as bad as being dead.

The unfortunate truth is that terrorist organizations use mainstream social networking sites to recruit and radicalize. We are sitting here talking about how power has been concentrated and how everyone is subject to censorship -- as if we did not just spend a decade watching ISIS and violent white nationalist groups amplify their messages by evading the controls that existed in legacy media outlets. "This is why we can't have nice things" is the expression that comes to mind...


> which is certainly bad, but not quite as bad as being dead.

This sort of reductionist argument doesn't work. If you start from the premise "any chance of dying is not worth it", then you won't do much of anything. Driving to the store could kill you, so now you can't go to the store?

You have to look at probability when assessing risk. Is a 0.0000001% reduction in your risk of death worth sacrificing all your personal freedom? I think most people would say no.


Except that it is not a minuscule chance of some violence occasionally being committed. ISIS nearly succeeded in establishing a new country in the territory they captured, and it is absurd to pretend that they had not exploited poor moderation on major websites to recruit large numbers of people to their cause. White nationalists and neo-nazis have been equally effective in their use of social media to recruit members and to spread their propaganda.

We are not talking about isolated incidents or hypothetical scenarios. Extremists in the US and Europe are becoming part of the political mainstream because so many people believe the extremist propaganda they are reading on social media platforms. Those same extremists have inspired an increasing number of terrorist attacks as their propaganda has spread. For someone like me, someone who is part of a minority group that is frequently targeted by those terrorists, that represents an immediate and growing danger.

Really though, this entire debate is poisoned by extreme positions on free speech. I remember watching as unmoderated Usenet groups were overrun by neonazis; everyone fled to moderated newsgroups or off of Usenet altogether to some better-moderated platform. Free speech absolutism has never worked well and it is juvenile to pretend that the choice is between "sacrificing all your personal freedom" or taking an absolutist approach to free speech.


Benjamin Franklin once said: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

I will not sacrifice free speech for added protection from terrorists.



Franklin used that line in more than one context. He also said "The Massachusetts must suffer all the Hazards and Mischiefs of War, rather than admit the Alteration of their Charters and Laws by Parliament. They who can give up essential Liberty to obtain a little temporary Safety, deserve neither Liberty nor Safety."

https://franklinpapers.org/framedVolumes.jsp?vol=21&page=497...

I think less important than the nuance of what Franklin meant in any specific statement, is that what most people mean when they use that quote is its literal meaning, and that if you look at Franklin's life and body of work he was clearly a staunch defender of the idea that Liberty is a God-given (or inherent) right of mankind, and one worth defending.


Interesting. So he wasn't saying what people today think he was saying.

Let me fix it:

Those who would sacrifice freedom for temporary security will eventually lose both and deserve neither.

- me, 2021

Doesn't quite carry the same weight with my name behind it, but I stand by it nonetheless.


I agree with your sentiment, in part, but I see this quote used in this context so often, that I felt that it's worth pointing out.


Yes, it was interesting to read about the historical context, thanks for pointing it out. I also think the contemporary interpretation of literature, or a quote in this case, often matters more than what the author originally intended - it can take on a life of its own.


I have read all of these explanations multiple times and have always had trouble reconciling everything here.

What do the words “liberty”, “purchase”, and “security” mean in this quotation?

As best as I can tell:

- liberty: the right to be sovereign and able to defend oneself

- security: safety from having taxes levied

- purchase: the guys being taxed said instead “look, we’ll give you money this one time instead but we don’t want you to have the right to levy taxes”

Okay, I guess that makes sense.


I wouldn’t necessarily assume that purchase means with taxes. I think there’s an interpretation where this is a non-monetary cost.


If you don't mind, can you kindly explain?


Not that context ever matters to extremists:

“It is a quotation that defends the authority of a legislature to govern in the interests of collective security. It means, in context, not quite the opposite of what it's almost always quoted as saying but much closer to the opposite than to the thing that people think it means.”

https://www.npr.org/2015/03/02/390245038/ben-franklins-famou...


Isn't it the same concept (giving up freedom/power in a shortsighted way), just a different population (government liberties vs individual liberties)?


> Not that context ever matters to extremist:

Are you calling me an extremist?


Free speech absolutism has not worked very well. Usenet was overrun by neonazis and all the interesting discussions moved to moderated newsgroups/mailing lists/forums. A genocidal terrorist organization made effective use of various social media platforms to recruit large numbers of people to their cause, and the world is a better place now that they have been banned.

Right now we are missing one of our best chances to end the COVID pandemic -- widespread vaccination -- because too many people are spreading lies about the vaccine on social media.

There is plenty of room for legitimate political debate, where people passionately advocate their preferred policies on various issues, without having to give a platform to people who are not arguing in good faith and whose real purpose is to advance a violent agenda. The cost of "free speech" is not "giving terrorists a platform to recruit and spread propaganda" and the politicians of Ben Franklin's generation actually did understand that (shortly after his death Congress passed the Sedition Act, which banned false statements about the US government to prevent foreign agents from destabilizing the newly formed country).


> Right now we are missing one of our best chances to end the COVID pandemic -- widespread vaccination -- because too many people are spreading lies about the vaccine on social media.

That's mostly a trust issue. If you silence their concerns you're just going to confirm that it's a grand conspiracy theory. You have to fight misinformation with the truth, with dialog, not with censorship.


It's a trust issue for some; others simply don't want to be told what to do. The more you shove the "get vaccinated" message in their faces the more they dig in and reject it.


the "concerns" are mostly coming from a small number of people who you probably won't be able to convince. Better to not use your platform to amplify them.

https://www.npr.org/2021/05/13/996570855/disinformation-doze...

Just 12 People Are Behind Most Vaccine Hoaxes On Social Media, Research Shows


This should be a non-issue. If these 12 bad guys have no points, no arguments, no good faith motives - that’s them against the entire western scientific community, the entire media, almost all politicians, celebrities, unions, vested interests…

How do you explain these “12 people” aren’t immediately shut down with all the facts?


Because facts aren't enough to win an argument. Especially when people pushing false conspiracy theories don't care about being consistent or accurate.

Especially when you need to "win" with over 95% of the population and not just a plurality or majority.

Also, sadly, some news media, some politicians, some celebrities, and vested interests (I don't know about labor unions; sadly I would not be totally surprised) are pushing a harmful anti-vax agenda that hurts everyone.


And how many people did those neonazis actually affect? How many more people saw that and immediately criticized the neonazis who wouldn't have seen them before?


The answer to your questions are very many, and some. The numbers are of course relative, but consider the following:

During the early internet of the 1980s, white supremacist groups were among the first[0] to begin using the new medium for organization and information purposes. They used it then to publish, among other things, a list[1] of "race traitors" including names, addresses, and phone numbers, to promulgate misinformation, to gaslight established norms and history (ex: Holocaust denialism), and to develop strategies for what can really only be described as terrorist indoctrination in many respects.

Some of those involved killed a man with automatic weapons and hijacked an armored car with millions in cash to finance a separatist uprising. One of them was Louis Beam[2], a quite violent seditionist who developed the "lone wolf" militia cell structure that is familiar today. Beam used these telecommunication/internet networks to create and distribute a lot of white separatist information. His activity goes on and on; it is quite vile in all respects. He has been charged with and acquitted of sedition.

In this academic piece by sociologist Chip Berlet[3], he recounts attempting to counteract the white supremacist BBSes with an anti-racist BBS at an anti-Klan symposium. The understanding of BBSes was quite poor at the time. By the 1990s the white-supremacist BBS network had grown quite a bit, distributing newspapers and operating file transfer and messaging services in a national network of neonazi BBSes including Stormfront[4], which is of course still in operation and quite influential. They successfully transitioned to the ordinary internet and also AOL, using them as very effective recruitment tools.

Neonazi/white supremacist/separatist/seditionist groups have used the internet very effectively pretty much from the beginning. Perhaps this is an effect of John Gabriel's Greater Internet Fuckwad Theory[5] as well as some kind of operationalized Poe's Law--race rallies thrive in protective shade. KKK marches and the like are routinely confronted by anti-racist counter-protests, but the current nature of online discourse continues to provide an asymmetric advantage to these types of activities. The old "Filter Bubble" doesn't lend many opportunities for normal people to insert themselves into the radicalization process...this could arguably be for better or worse.

The literature on this is vast; exploring how a normal person can become radicalized into a racist white separatist is a strange rabbit hole to descend.

There exists a kind and inspiring man named Daryl Davis[6] who is pretty good at converting KKK/supremacists (he has many surprising success stories) away from this kind of behavior, but notice how his methodology requires a personal touch and much compassion. How many "more people saw that and immediately criticized the neonazis who wouldn't have seen them before?" is not a very good discriminator for this activity at all. Effectively, "None" is the real answer to your question.

The fact of the matter is that toxic memes and divisive trolling are consumed by people while on the can, idly (or perhaps compulsively) skimming social media and whatnot. The uncritical ingestion of this kind of thing simply habituates people to these kinds of beliefs. I don't think a person who has fallen for this stuff is necessarily bad at first blush, and they surely have many possibilities for redemption, but the effort required is really not the kind that is easily rallied.

It's a complicated notion, but it boils down to the fact that you have to fight Hate with Love.

[0] - https://timeline.com/white-supremacist-early-internet-5e9167...

[1] - https://www.nytimes.com/1985/02/15/us/computer-network-links...

[2] - https://www.splcenter.org/fighting-hate/extremist-files/indi...

[3] - http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.552...

[4] - https://en.wikipedia.org/wiki/Stormfront_(website)

[5] - (original source unavailable) https://en.wikipedia.org/wiki/Penny_Arcade#%22Greater_Intern...

[6] - https://www.theguardian.com/music/2020/mar/18/daryl-davis-bl...


Are there fewer neonazis now than in the 1980s-90s? Are there more people now than ever who totally disavow white supremacy?

What is the population of the KKK right now? What percentage of America is that?

Don't get lost in the narratives.


No, don't get lost in the insincere gaslighting.


Can you answer my questions? They paint a different story than what the NYT is telling you.

I'm being very sincere I want to get to the truth. If one person of a specific group does something bad and it gets mainstream news does the size of that group and overall impact it has increase? Of course not.

If there are fewer neonazis now than in the early days of the internet, then doesn't that mean that having the ability to see their information actually exposed their bad ideas and allowed people to see them for what they actually are? Consider it.


I am struggling to see how this relates at all to how I have answered your rhetorical questions from before.

As you are well aware, it is a material thing how the structure of this kind of indoctrination occurs in society and this I have adequately described, albeit very briefly, as it has a long and colorful history.

I would like to answer your question in another fashion: Is Stormfront the KKK?


The argument that we should let neonazis/etc. parade their ideas around in public so that the world can see them for who they really are has become a lot less convincing because neonazis have refined their tactics (see e.g. "boots for suits"). They are not parading their hatred. Today they start with softer language, focusing on the supposed struggle of white people in America, how non-white people seem to be getting a leg up at the expense of white people, etc. Once they have drawn someone in, someone who for whatever reason found that the "great replacement" or "white genocide" theory resonated with them, they start to give the "explanation" for all the problems -- out of sight, away from people who might criticize them.


> The worst that can happen to me from censorship being imposed is that I will not be allowed to say certain things online or challenge those in power, or possibly that I will be excluded from these social networks entirely because the people in power do not like what I said -- which is certainly bad, but not quite as bad as being dead.

In the censored system, people in power can kill you, censor any talk about it, and no one will ever know.


> The unfortunate truth is that terrorist organizations use mainstream social networking sites to recruit and radicalize

If an actual terrorist organization is doing this, as defined by the FBI, then our government should be in charge of handling that.

Terrorism is already illegal, and we already have government organizations in charge of tracking that stuff down.

And those government organizations have checks and balances, and judges, and have to follow certain rules.

These checks and balances do not exist, with other forms of moderation.


You certainly must love the way things are going in Belarus then.


I prefer the way things are run in countries like France and Germany, where people understand the value of free speech in debating politics and spreading new ideas while also recognizing that some limits are necessary (e.g. limits on speech that promotes a resurgence of nazism).


Every startup wants to be a new Facebook. Every modern society wants to be a new Sweden. Turns out that replicating successfully run communities - with their traditions, history and lifestyle - is extremely difficult, if ever possible.


The problem is that the sentiment of "limit speech that promotes a resurgence of X" is the same one that dictatorships like Argentina's used against the "subversives" who would promote and bring about communism.

When you allow the powerful to define "dangerous speech", criticism of the government very quickly becomes dangerous when it's successful.

We've seen it with everything related to covid where in some countries people get arrested for "misinformation".

The slippery slope many warned about is very real, and very damaging.


That is an absolutist argument. Sure, some countries have used censorship in abusive ways, but I very specifically mentioned Germany. It was widely recognized after World War II that a resurgence of nazism would have been disastrous for Germany, and censorship was imposed to prevent the hundreds of thousands of committed nazi party members from trying to reestablish nazism. Censorship was absolutely necessary because propaganda had been so important in the rise of nazism in the first place, and because nazi ideas and the nazi world view did not simply die when Germany surrendered.

Despite very strict censorship laws related to nazi propaganda, nazi symbolism, and Holocaust denial, Germany ranks much higher than the United States in terms of press freedom and has ranked higher for many years. There is no serious argument that (modern, reunified) Germany is not a free society where people are free to criticize the government. Yes, there are limits to the ideas that are allowed to be publicly promoted or debated, because the same concerns about a resurgence of nazism remain relevant today (it has actually become worse in recent years).

The abuse of censorship laws on the part of dictatorships is not an argument against all censorship, any more than the abuse of the courts by dictatorships is an argument against the rule of law. Expansive censorship is a problem that works against the interests of a free society, but limited censorship is sometimes necessary to promote the interests of a free society.


If my argument is absolutist then your argument is authoritarian.

Mine is a safer proposition for the oppressed, for minorities, for those not aligned with the politics of the elites.

The whole argument about Nazi resurgence really falls flat when you realize that people get called Nazis for espousing wrongthink politics.

Your whole proposition is "the dangers of Nazism are too great". That's the same tactic I mentioned: "the dangers of communism are too great".

If you want, ignore dictatorships. The slippery slope in non dictatorships is extremely damaging.

You're not gonna limit your censorship either, as you suggest in your last line. You're going to use it against your political foes to continue in or seize power.

Your motte-and-bailey tactic of getting people to accept "some censorship for their own good" and then censoring speech that goes against your politics by calling it "Nazism" is a way bigger threat than the actual chance of a "Nazi ideology". There was a time when the ACLU safeguarded the right to say the most abhorrent of views, the same way you make sure due process is served even for the alleged most abhorrent criminals.

It benefits everyone.


I don’t see what the argument is that the marketplace of ideas isn’t working.

Facebook doesn’t prevent you from using Twitter, or Reddit or a small new network. All kinds of people can and do engage with multiple networks, big and small, at their own inclination. All of the platforms are accessed on the same internet using the same protocols and clients, and it’s trivial — completely trivial — to find alternate content or all kinds (at least in the US).

People aren’t abandoning the big platforms in droves because they generally like those big platforms. (Probably almost nobody thinks they are great, but that’s not the standard. The standard is better than the alternatives.)

As the article points out, the marketplace of ideas is leading Twitter to more flexible moderation mechanisms that let users choose their own level.

Seems crazy to change the rules, force companies to break their business models, when the system is working fine.

Doctorow (and others) may not like it, but many, many people like Facebook and Twitter a lot more than they dislike them.

Protective regulation should come into play when the affected populace is not in any position to consent to (or decline) the situation. E.g., because alternatives aren’t available, or because the issue requires special expertise to understand, etc. But generally this doesn’t apply here. (I do think there needs to be recourse for people who are banned or otherwise lose access to something they’ve invested in.)


Enragement equals engagement.

If that's the world you want, good news: that's the world we have now. And the magic of AI and recommender systems has disrupted that old and stupid "if it bleeds, it leads" model, getting you outraged faster and with far less effort than it used to take in the old and stupid days.

But that's not the world I want to live in personally, and if it takes some imperfect, even sometimes ham-fisted regulation to get us to a better place, I'm willing to deal with that rather than embrace a status quo that sucks and is just getting worse as the tools to hack social media have been democratized at scale.

Ideally I'd love to let the free marketplace of ideas work this out (and it eventually will and I am a huge believer in weakly efficient marketplaces), but unless I get a 1000 plus year lifespan, I only have so many years left in my lightcone so I side with being more proactive.


Perhaps our generation must suffer during an evolutionarily difficult time so that we do not end up in 1984?


Perhaps our generation must rely less on ridiculous hyperbole whose only benefit is to drive engagement and instead consider more compromise solutions to situations like this?


What is given up in the compromise?


"The free marketplace of ideas" (and also classical liberalism) is hugely reliant on the idea of humans as rational agents (both the people selling and the people consuming). But when we're not, everything opens up to tons of propaganda (especially that from fascists who are more than willing to engage where leftists will reject premises).

You can't have "enragement equals engagement" AND the "free" marketplace of ideas is good. The first is the natural consequence of the existence of the latter. If you take issue with the first, you should not accept the second either.

As for what should replace that 'free' market, I honestly don't know. But deplatforming (not censoring, deplatforming) hate speech is a good place to start.


Humans, in my opinion and perhaps my opinion alone, are ultimately rational agents prone to do the right thing, but only after they've done all the wrong things. I don't have time for that here, so I support experimenting with freedom of speech while regulating obvious abuses of it, like Facebook monetizing hatred and superstition. I don't know if I'm right or wrong here, but that's my story and I'm sticking to it.


I tend to agree with this article: we all should be worried about being dependent on having only a few core systems where speech happens, especially when network effect & switching costs are very very high.

I believe platforms have the right to moderate themselves as they see fit. They get that freedom, and we're better for letting private systems regulate themselves. Or, (as mentioned) as Twitter is floating, creating an “app store for moderation”, making moderation an interoperable layer rather than integrated.

And I agree with where Cory eventually brings the discussion to: what makes everything all feel so impossible is that switching costs are astronomical. If we leave the one network our friend is on we lose all digital connection with that friend, lose the view we'd get of them interacting with others. Competitive Compatibility is needed, to let us allow private companies to create their own rules, but to not keep each of us restrained & restricted within a handful of supersized networks.


> “app store for moderation”

One amusing possibility about this approach is that if Twitter implemented it and governments decided that they didn't like Twitter's default censorship regime, Twitter could say "That's fine, just provide your own censorship system and write a law forcing that system onto all users in your country."

I suspect at least some governments would be reluctant to incur the financial and political costs of maintaining their own (inevitably imperfect and controversial) censorship regime, and would then find it harder to act at arms length and say "Twitter has to do more" as they do currently.


"Hey man, I really enjoy our friendship but I just can't do Facebook anymore for mental health reasons, what's your email?"

What people are really saying is that they know it's bad but can't stop; they can't turn away, they can't say "no, I don't want to participate anymore." Then they blame their friends: "oh, I'm just going where they go." I wouldn't want to "lose all digital connection with this person"; well, why not reach out to them? Said another way: if you and your friends don't maintain connections regardless of the underlying social systems, are you really friends? I've somehow carted a group of random online people along with me across various networks, fragmenting messengers, good old email, and phone numbers, going in and out of contact since at least 2002. The underlying protocols might change but they're the same people. What prevents you from doing this?


Your friend is actively broadcasting their photos to the world (except you), sharing articles and opinions, writing on another friend's wall (where you can see it). Social places are filled with activity. They are broadcast mediums that beget further broadcast interactions, which beget yet more broadcast interaction.

This cruel, small-hearted, tough-love view of social you present reads as extremely unempathetic to me. I don't see how you can miss so clearly the core idea of Metcalfe's law: that the value of a network rises according to the square of its nodes. And each interactable interaction, in my view, constitutes its own node, its own potential for something new to start amid the network. Not being in that place can be enormously detrimental, and imo, Competitive Compatibility is absolutely an obvious, sensible public policy to ensure that you can still be with your friend, even if you don't want to be with Facebook.

Reducing this to "why don't you stay in better touch" is like, "oh you don't speak the same language, why don't you two invent one?".


Why does any part of this need to happen in public? Not in a text group? A random Discord server? Is it just some sort of underlying exhibitionist drive that Myspace originally tapped into? To me, my friends are the 15-20 people I've grown to know and like. From reading what you said here, it seems as though to you the word friends captures your entire extended network, with friends of friends, old coworkers, etc. If your question is how to show off to these people, if you're optimizing for the number of 'interactable interactions' or something, we've left friendship far behind and turned this into a 'numbers go up' game. I'm not less friendly with my best friend because I didn't see her latest comment on some random meme an acquaintance from school 10 years ago posted.


You propose replacing the public commons with something different, immediately showing you have gained no sense of what Metcalfe's Law implies or what value social networks enable.

None of this sounds in any way realistic or like a viable replacement for what we are right now stuck with. Giving people public broadcast capabilities is different; it's easier. I wouldn't necessarily say it's better, but the ease and ambience of broadcast is a huge advantage, and it leads to far more interesting mixing. Nothing you've proposed sounds in any way similar. Your words continue to amount to: withdraw from the public. I for one do not see that as a likely or desirable counter-conclusion for myself, for my friends and family, or for society.


This idea of “the public” is a distorted, commercially-mediated fantasy. Facebook is a product, it is not a public square.


> If we leave the one network our friend is on we lose all digital connection with that friend, lose the view we'd get of them interacting with others.

Ever since I started using the Fediverse, I haven't had this problem. Sure, most of my friends aren't there, but I'm happy to IM or email them instead.


yeah i actually like seeing my friends talking with each other & being able to join in the various conversations they're having.

if you're fine just leaving the online social circles your friends are in, good for you. but i hope you can see how very very very very very short an answer that is & how rude it comes across to most people. and i'm sorry your friends are so boring.


Don't you think it's selfish to shame your friends if they decide they don't want to be part of a predatory social media company?


I think that's why Cyberpunk as a fictional genre is dead, it's too real.


I've been waiting for @cstross to finish the Halting State series but at last update, he appeared to opine that fiction was getting too close to reality. Thus he has declined to finish the series.

I felt this was a bit of a cop out at the time. The Halting State series had us thinking about these issues, which I think is better than not thinking, writing, and debating them. If reality is catching up to fiction, then write about reality, don't be afraid of it.


I'm just not worried at all... this problem is only a problem for people who think social networks matter. They don't. These connections are vapor; their importance is only granted by the person who believes in it.

To think that social media is important is to think being famous is important. It's the same mindset that makes people feel more or less confident based on how many likes they have.

Step away from the social media. Do something active in the real world. Learn a craft. Make a connection with someone in your area.

Please like this comment.


Does anyone remember the days when internet media companies didn’t want to censor anything? Because after all it costs more and puts them in a position where nobody is satisfied?

And do you remember when they were called onto the carpet by repeated congressional committees for…

Allowing copyright violations?

Allowing terrorists’ messages?

Allowing misinformation relating to elections?

Allowing hate speech?

Now censoring misinformation?

Do you remember when you were incensed by the inaction of media companies and demanded action, and now are dismayed that they are acting?


> Do you remember when you were incensed by the inaction of media companies

Not even a little bit. Some of us have been ideologically consistent that censorship is not helpful.


When the Trump accounts were shut down, political stances and attitudes silenced the criticism. The social media accounts of the president of the USA were closed. The account of a man who received half the votes in the USA was closed. The majority could not say it was wrong then, and now it's too late to talk and complain.


The people on the "right side of history" are usually fighting for freedom and liberty, not censorship and re-education camps


Winners usually frame their fights as ones for freedom and liberty, and emphasize their enemy's use of censorship and re-education camps.


Something is either being censored or it's not. Regardless of the issue, I'll side with the side that is not censoring information.


A quick perusal through history will show that conflicts where one side is censorious and the other is not have basically never happened. Usually the winning side sweeps its misdeeds under the rug afterwards.

See: US behavior towards communists and left wing groups during the Cold War, or censorship during WW1 and WW2. The whole "fire in a crowded theater" trope comes from the unanimous Supreme Court decision upholding the conviction of someone peacefully distributing fliers protesting the draft in WW1[0]. Funny how that one usually doesn't make it into the history books.

0 - Schenck v. United States. Thankfully no longer good law.


Most people are openly taught about the misdeeds of the McCarthy era in schools and in public discussions. This wasn't swept under the rug; most people agree that it happened and wasn't a good thing.


Eventual recognition of misdeeds is nice, but irrelevant when the original claim is that there is (or has been) one side that’s anti-censorship. Being pro-censorship and then apologizing later isn’t enough.


How much do they learn about the Office of Censorship set up following Pearl Harbour and the role the federal government thought it played in defeating Hitler and Japan? Or the history of battles over "obscenity" laws, often upheld by constitutional courts?

Learning "censorship is unAmerican, here's an example of how we tried it once and it was really bad so we ended it" is the definition of sweeping the nuanced reality of speech battles in the US under the carpet in favour of the free speech version of American exceptionalist myths.


People on the right side of history are the ones who won the war, period.


The people on the right side of history in WWII locked up their own citizens based on their ethnic heritage.


Quite. The UK also had a formal censorship regime; America managed with an informal one voluntarily carried out by media organisations: https://censorshipissues.wordpress.com/2010/09/21/censorship...

(See the varied career of William Joyce: https://en.m.wikipedia.org/wiki/Lord_Haw-Haw from informing against the IRA to British fascism to German propaganda broadcaster; he was eventually hanged for treason.)


FDR's track record on individual rights is rather poor. Seizure of gold, censorship, and internment come to mind.


The long term threat of big tech censorship is that it is priming governments with an expectation that all online communication can (and therefore should) be policed - the exact opposite of the decentralized permissionless Internet dream. The longer they maintain bearable levels of censorship, the longer governments have to cozy up to the idea and expect to apply it to communications technology that lacks the centralization vulnerability. So ultimately, the faster Big Tech implodes and goes the way of Digg, the better. We should cheer when they stay well ahead of what governments want to censor - it makes it clear they are more like TV channels than letting them claim to be manifestations of "the Internet".


Mandating interoperability, with ActivityPub and similar protocols, would help.

In the long term I'd like to see a future where it's a normal thing for people to roll their own social media platforms, by putting a Raspberry Pi on their local network, downloading and configuring some open-source software, and the local nodes all talk to each other and doing so gives someone the same level of functionality that they today have with a Facebook/Twitter/YouTube/Tiktok/etc account.
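(To make that interoperability idea concrete, here's a rough sketch, in Python, of the minimal ActivityPub actor document such a self-hosted node would serve so that other servers can discover and follow it. The domain and username are purely illustrative placeholders, not tied to any real deployment.)

```python
import json

# A minimal ActivityPub "Person" actor, per the W3C ActivityPub spec.
# "pi.example.net" and "alice" are illustrative placeholders.
actor = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Person",
    "id": "https://pi.example.net/users/alice",
    "preferredUsername": "alice",
    "inbox": "https://pi.example.net/users/alice/inbox",
    "outbox": "https://pi.example.net/users/alice/outbox",
}

# A node would answer GET /users/alice with this JSON document
# (Content-Type: application/activity+json); other servers POST
# Follow/Create activities to the inbox and read posts from the outbox.
document = json.dumps(actor, indent=2)
print(document)
```

That's essentially all the "social graph plumbing" a Pi on a home network would need to expose for Mastodon-style servers to federate with it.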


That’ll never happen, for the same reason that basically nobody runs their own online services today. Most people strongly favor the convenience of existing SaaS services, which is why Dropbox beats out all the self-hosted alternatives by orders of magnitude.


> Most people strongly favor the convenience of existing SaaS services

If it was more convenient to roll your own, then I contend that many people would, and many more would have accounts on friends' servers. The difficulty shouldn't be any higher than the cost of a Pi server and spending an hour setting it up -- all of which is perfectly possible to achieve technically.


I believe that you are completely misguided on why people choose to not self host. I don’t think it’s about convenience, they just don’t care about the same things you care about.

There is a ton of easy to use self hosted software out there, some of which I run. Compared to their SaaS alternatives though they are extremely unpopular. For a lot of people “buy a raspberry pi” will always be too complex, too expensive (you’re competing against free), and too much long term hassle. Much easier to use the free service that everyone else uses which will be continuously updated with new features automatically.


> I don’t think it’s about convenience, they just don’t care about the same things you care about.

Some people do. The c. 1 million people who use mastodon or other fediverse services care.

> For a lot of people “buy a raspberry pi” will always be too complex, too expensive (you’re competing against free), and too much long term hassle.

This is true; it is also irrelevant. I'm imagining a world where maybe 1 internet user in 100 will have a Pi on their home internet connection, and maybe half the people on the internet will have accounts on other people's Pi servers.


meh … dropbox really?


Yeah, file storage/syncing is probably the best example of successful self-hosting for moderately "techy" people, with commercial NASes usually offering much better and cheaper services than cloud storage for home/office usage.


I’ve been writing a bit about that idea https://paulfrazee.medium.com/productizing-p2p-bff5aed95f6a


A little off topic, but wouldn't an easier fix for some of the addiction and misinformation bubbles be to legislate that all "feed apps" must show things newest to oldest by default, and require users to consciously click, every time, to show "top" or "hot"? This would do a lot to fix the "zombie feed scrolling", in my opinion, and prevent engagement algorithms from only showing polarizing content.


Actually, the people getting banned and censored did go to Gab and a few other sites that were cut off from PayPal, Visa, Patreon, and others. They have to go somewhere, and they are not wanted on Big Tech social networks. As a result they have been radicalized, believing conspiracy theories and joining alt-right groups.


My experience with peeking into Gab was being greeted by a surprisingly large number of neo-nazi posts dominating the site.

From my experience you don’t really need to push neo-nazis into conspiracy theories – their ideology is already one big conspiracy theory.


Also, what about laws and governments? Missing this part of the picture is a very American perspective, IMHO.

Some things should be regulated. Especially posts on FB/Twitter/etc. that are provably false and whose diffusion is nothing but a carefully crafted op by a foreign country.

You could be concerned about over-regulation, but currently we are at the opposite end of the spectrum.

Asking FB/Twitter - or any company - to do the right thing and act against its own interest is simply naive.


Honestly, aren't there bigger problems in this world? These days it's all privacy and tech. What about governments going out of their way to deny people a roof over their heads?

Now, I know that isn't necessarily the EFF's problem, but it's a bigger problem than Facebook censoring stuff that doesn't align with their idea of political correctness. Don't get me wrong, I didn't like it when FB censored one of my country's greatest traditions, but it's just one of those annoyances that you need to accept.


Not that Cory is wrong but his words would hold more weight if boingboing hadn't been engaging in the behavior he decries for over a decade. Memory holing people who they no longer feel "are wonderful" and removing vowels from posts they don't agree with are just two examples of the type of activity they've long engaged in. These aren't cases of off-topic discussion, threats, or other attempts at derailing discussions, they're personality, romantic, or ideological disagreements that aren't allowed to exist.


Can you share links to works from people that you deem memory holed by EFF, please?


Article and linked proposal really just don't even attempt to tackle the privacy implications of all this. People don't like that WhatsApp knows your _metadata_, but we're supposed to be good with anyone being able to bring all the metadata of their entire social network to any random provider? How do you make such an argument in 2021, the year "privacy" went mainstream?


I wish this attitude was more prevalent. The most dangerous thing we can do is turn a non-partisan issue into a partisan one. As soon as it becomes about right vs. left, you have an army of people on Twitter, etc. who are ready to mindlessly shill a viewpoint, not because they know anything about it, but because it aligns with their political identity.


The proposed ACCESS Act discussed in the post appears to be at: https://www.congress.gov/bill/117th-congress/house-bill/3849...


No one has 'the truth', it's just not that sort of thing. Everyone has opinions. Some opinions are lies. Transparency is the way to deal with the lies.

But if you're the one lying and have your hands on the levers of power, censorship is a great tool.


Probably a good place for reposting this: https://publicseminar.org/essays/why-we-should-outlaw-oppres...


That's basically just saying that racism and hate speech are super ultra duper bad and we should outlaw things like that, as well as vastly expand what we consider beyond the pale so that it captures what are fairly mainstream opinions. I (as well as many others) disagree, and in fact I think that the people that argue for hate speech laws are far more dangerous than the tiny handful of actual white supremacists out there who are just losing their battle slowly over time.


Did you know that in the Areopagitica, John Milton's 1644 pamphlet that is a landmark in the history of free speech, Milton literally said it was OK to censor Catholicism and superstition? [1]

Freedom of speech has always been subject to limitations, exceptions, and special pleadings. There is no absolute liberty; there is only freedom to the extent that society is indifferent to it, or can at least tolerate it.

"even opinions lose their immunity when the circumstances in which they are expressed are such as to constitute their expression a positive instigation to some mischievous act" [2]

[1] "Yet if all cannot be of one mind—as who looks they should be?—this doubtless is more wholesome, more prudent, and more Christian, that many be tolerated, rather than all compelled. I mean not tolerated popery, and open superstition, which, as it extirpates all religions and civil supremacies, so itself should be extirpate" https://www.gutenberg.org/files/608/608-h/608-h.htm

[2] https://www.theatlantic.com/magazine/archive/1919/11/the-bas...


ACLU is still silent about the latest attack on free speech. They still haven't opposed the White House's plan to enforce speech. I've cancelled my monthly donations to them because of it. Please do the same if you give them money.


I have never been comfortable with supporting the ACLU because of their "money is speech" stance. The political system of the US has become increasingly plutocratic, making it hard for popular policies to be enacted and make progress toward important issues like fighting poverty and climate change.


ACLU has been down the tubes for a while now, especially when it comes to political cases.

> ACLU Lawyer Cheers Suppression of a New Book

https://greenwald.substack.com/p/the-ongoing-death-of-free-s...

> ACLU Again Cowardly Abstains From an Online Censorship Controversy

https://greenwald.substack.com/p/aclu-again-cowardly-abstain...


Imagine if Mao and Sun Yat Sen were duking it out, and nerds blamed Facebook for the resulting social polarization.

Yeah man, and the printing press too. Those Protestant Reformation wars were no joke!


I am starting to think tech is bad, period. If we could have avoided the industrial revolution it would have been much better for humanity and for the planet.


The censorship debate always reminds me of the 2000s after 9/11, when anyone who opposed going to war in multiple foreign countries which had nothing to do with 9/11 was labelled a "terrorist sympathizer". Taking advantage of this, they also passed a mass spying bill - the Patriot Act.

Stating "a man cannot be a woman" or "men shouldn't compete in women's sports" is considered transphobic in the current climate. Regardless of which side of the Palestine-Israel conflict you fall on, either or both sides can be censored by labelling them "anti-Semitic" or "Islamophobic". Speaking out against the Patriot Act and the forever wars would be (and was) labelled "unpatriotic" and "terrorist sympathizing" and censored. Speaking out in favor of healthy living could be labelled "fatphobic". Or the opposite could happen: all the groups which speak out against fat shaming could get censored because it's dangerous to public health, and the government and big tech would start censoring you for advertising soft drinks/chips and other junk food. Since all major religions oppose gay marriage, when the super religious are in power, they would censor LGBT content. Pro-choice would get censored for religious reasons, and pro-life would be censored as racist/dangerous or for other reasons.

Just a few days ago there were 2 posts on HN:

Rival weightlifter speaks out on transgender Hubbard's Olympic place:

https://news.ycombinator.com/item?id=27598383

The blackout Palestinians are facing on social media:

https://news.ycombinator.com/item?id=27645282

In the first case, my women friends are too afraid to speak up because they are afraid of being labelled transphobic. Three of my female friends have lost sports scholarships to biological men.

In the second case, the reasoning provided is "Criticism of Israel is Anti-Semitism. Really.":

https://blogs.timesofisrael.com/criticism-of-israel-is-anti-...

Here is a partial historical list of "deniers" of scientific consensus who were proven right - the kind of people this sort of censorship will silence, whether through big tech or through self-censorship:

1. Ignaz Semmelweis, who suggested that doctors should wash their hands, and who eliminated puerperal fever as a result, was fired, harassed, forced to move, had his career destroyed, and died in a mental institution at age 47. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Ignaz_Semmelweis#Conflict_with...

2. Alfred Wegener, the geophysicist who first proposed continental drift, the basis of plate tectonics, was berated for over 40 years by mainstream geologists who organized to oppose him in favour of a trans-oceanic land bridge. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Alfred_Wegener#Reaction

3. Aristarchus of Samos, Copernicus, Kepler, Galileo, brilliant minds and leaders in their field all supported the heliocentric model. They were at some point either ignored, derided, vilified, or jailed for their beliefs. All this because they went against consensus science.

https://en.wikipedia.org/wiki/Galileo_Galilei#Controversy_ov...

4. J Harlen Bretz, the geologist who documented the catastrophic Missoula floods, was ridiculed and humiliated by uniformitarian "elders" for 30 years before his ideas were accepted. He first proposed that a giant flood raked Eastern Washington in prehistoric times, and suffered ridicule and skepticism until decades of further research proved his thesis. All this because he went against consensus science. He was eventually awarded the Penrose Medal.

https://www.seattletimes.com/entertainment/books/bretzs-floo...

5. Carl F. Gauss, an early discoverer of non-Euclidean geometry, self-censored his own work for 30 years for fear of ridicule, reprisal, and relegation. It did not become known until after his death; similar published work was ridiculed. His personal diaries indicate that he had made several important mathematical discoveries years or decades before his contemporaries published them. Scottish-American mathematician and writer Eric Temple Bell said that if Gauss had published all of his discoveries in a timely manner, he would have advanced mathematics by fifty years. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Carl_Friedrich_Gauss#Personali...

6. Hannes Alfvén, a Nobel plasma physicist, showed that electric currents operate at large scales in the cosmos. His work was considered unorthodox and is still rejected despite providing answers to many of cosmology's problems. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Hannes_Alfvén

7. Georg Cantor, creator of set theory in mathematics, was so fiercely attacked that he suffered long bouts of depression. He was called a charlatan and a corrupter of youth and his work was referred to as utter nonsense. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Georg_Cantor

8. Kristian Birkeland, the man who explained the polar aurorae, had his views disputed and ridiculed as a fringe theory by mainstream scientists until fifty years after his death. He is thought by some to have committed suicide. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Kristian_Birkeland#Legacy

9. Gregor Mendel, founder of genetics, whose seminal paper was criticized by the scientific community, was ignored for over 35 years. Most of the leading scientists simply failed to understand his obscure and innovative work. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Gregor_Mendel#Initial_receptio...

10. Michael Servetus discovered pulmonary circulation. As his work was deemed to be heretical, the inquisitors confiscated his property, arrested, imprisoned, tortured, and burned him at the stake atop a pyre of his own books. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Michael_Servetus#Imprisonment_...

11. Amedeo Avogadro's atomic-molecular theory was ignored by the scientific community, as was future similar work. It was confirmed four years after his death, yet it took fully one hundred years for his theory to be accepted. All this because he went against consensus science.

https://en.wikipedia.org/wiki/Amedeo_Avogadro#Response_to_th...


I posted my public stance on this topic here: https://www.remarkbox.com/remarkbox-is-now-pay-what-you-can....


Censorship is a symptom. It is a means. The ends is control. "The Age of Surveillance Capitalism" changed my understanding and view of not only Big Tech, but the broader political context as well.

https://en.m.wikipedia.org/wiki/The_Age_of_Surveillance_Capi...


The reason that centrists support private censorship now is very simple:

We are now facing disinformation campaigns that have apocalyptic consequences, and normal legal channels are quite ineffective at battling them.

"Well of course, it's not the government's job to decide between true and false, nor to enforce it!" you say.

Bullshit, I say.

Slander is illegal. False advertising is illegal. Fraud is illegal. Defamation is illegal. Perjury is illegal. Filing a false report is illegal.

There are a plethora of torts and laws where you will get sued or jailed for lying about something important (like a court matter) and in those cases the truth is a defense! That means a court decides what is true and what is false.

So we've admitted that we're willing to abridge free speech to protect truth. That's now established. But we're only willing to do it for cases where somebody can show clear damages, and where there's a clear target to sue.

A diffuse, widespread misinformation movement against a concept and not a person? Like climate change, or vaccines, or covid, or Judaism at large? Those aren't protected at all. And why not? Because there isn't a shareholder of climate change who can show that your misinformation has unjustly damaged his share price?

So given this legal vacuum, is it any wonder that the reality-based community has embraced private censorship? Nobody loves this solution, but the alternative is letting one third of the population doom the other two thirds by obstructing the kind of actions we need to survive.


> There are a plethora of torts and laws where you will get sued or jailed for lying about something important (like a court matter) and in those cases the truth is a defense! That means a court decides what is true and what is false.

Yes. Courts do that. They decide on what the facts are, and on how the law applies to those facts.

I don't have a problem with a court doing that. I have a problem with Facebook doing that. Even more I have a problem with some government agency telling Facebook to do that (unless the government agency is a court, and they have a finding of fact on that particular issue).

And, "the reality-based community"? Was that the community that agreed that masks wouldn't help? Or was it the exact same community a month later, that said that masks would help? I mean, it's good that they're trying to follow the evidence. But no, I won't let them censor, because they've been wrong before, and will be again.


>Like climate change...

The problem here is that there is not some clear-cut answer to "what is to be done" about climate change, and unless you restrict yourself to the benign observation that "the climate changes", everything else can be categorized as "misinformation" if you have a political objective. Agree that climate change is happening, but don't agree with a massive spending project to address it? Too bad, you've just committed misinformation.

>Nobody loves this solution, but the alternative is letting one third of the population doom the other two thirds by obstructing the kind of actions we need to survive.

I just don't get this at all. What doom? Are you implying the vaccine doesn't work and you can still catch COVID if a bunch of other people don't take it?


A return to 1930-1945 Germany, it looks like? Freedom of speech is such an important right and privilege; it needs to be saved, and so does democracy! Let's not repeat the past.


What you should be worried about is people claiming to have the right to use your private property and when violating your terms of service complain about "censorship." It's no different than inviting someone to your home for dinner whereupon they go on a racist/political rant to the point where you ask them to leave, and then they complain loudly about censorship.

It's not censorship.


It was certainly weird to see all the "they're private businesses, they can do what they want!" takes from the internet when Trump and Parler were banned, after the same people have been yelling for years that these same corporations have too much power over the public narrative.


Prior to Trump getting banned, an outrage was mounting over Facebook/Twitter banning all kinds of content that was seen as far less dangerous, yet continuing to enable what was seen as extremely dangerous content from Trump.

It was outrage over inconsistent policy, not your cynical read that people are just self-interested morons.


> It was certainly weird to see all the "they're private businesses, they can do what they want!" takes from the internet when Trump and Parler were banned, after the same people have been yelling for years that these same corporations have too much power over the public narrative.

Why?

It's not inconsistent to argue, as many on the left have - both before and after the right-wingers started whining because they lost their excessively favorable treatment from Twitter, Facebook, et al. - that: (1) free speech means that private actors should have the right to control the messages their resources are used to relay; and (2) a subset of the tech companies that right-wingers complain about (to justify imposing state-backed mandates on internet providers generally to carry right-wing content) are overly dominant monopolies and have excessive power over public communication because of that, which is a problem that should be dealt with by dealing with the monopolies, not curtailing the freedom of speech of private actors.


> excessive power over public communication [...] should be dealt with by dealing with the monopolies, not curtailing the freedom of speech of private actors.

That sounds like a nice idea. However, I'm seeing it for the first time. While (1) and (2) aren't inconsistent, typically only (1) is brought up.

Take your average thread about how, say, Google is now beginning to censor X thing. It makes little sense to defend big tech censorship with point (1), leaving out (2), if you actually hold both opinions (1) and (2). The latter kind of implies that you agree big tech censorship is a problem. The former in isolation points toward the opposite. Therefore, I don't think most people who argue (1) actually believe in (2) as well.


This - the solution isn't to compel overly large websites or web hosts to carry speech that they don't want. It's to use antitrust law to break them up so that more, smaller websites can viably carry whatever they want.

Now, there should also be a clear demarcation of which internet services need to be common carriers and which do not. Say bandwidth, colocation, IP address allocation, and domain registration services were required to take all comers. Then if you want to have your gay commie gun club forum, you could get a domain, buy your own hardware, put it up in a colo facility, get bandwidth and IP addresses, and then have an uncensored presence on the net.


I'm not concerned. If I ran a platform serving out QAnon BS, I'd ban them and log their details. Terrorists don't get protections from a private company, nor should they. People don't understand just how insane these groups have become. Without FBI intervention these terrorist groups would have murdered multiple state governors and carried out plenty of bombings. I'm not going to "be neutral" toward such people.

1. Gretchen Whitmer kidnapping plot, https://en.wikipedia.org/wiki/Gretchen_Whitmer_kidnapping_pl..., "The Wolverine Watchmen group had been recruiting members on Facebook from November 2019 until June 2020, when Facebook began purging all boogaloo-related material."

2. AWS datacenter attack, https://www.datacenterknowledge.com/amazon/aws-data-center-s..., "Management has warned Amazon data center staff to be on the lookout for any suspicious activity following a comment on Parler that suggested “someone with explosives training” could “pay a visit to some AWS data centers,""

3. Sovereign citizen movement + violent plots, https://www.splcenter.org/fighting-hate/extremist-files/ideo...

4. Jan 6th: Pauline Bauer - https://youtu.be/rvgwR4kfQVs, participated in the insurrection because "Democrats are behind pedo rings" - fabricated propaganda perpetuated on the net that we can now see has actual, real-world harm.

5. 8chan, https://blog.cloudflare.com/terminating-service-for-8chan/, hosted many forms of illegal, harmful, and predatory content: directions/links to CP, animal abuse, revenge porn, and coordinated harassment of groups and individuals for the purpose of driving people to suicide. This is still there on their new site, but it is now a heavy hotbed of conspiracy nuts who believe they're the victims.

Plenty more that I can't remember off the top of my head.


>Plenty more that I can't remember off the top of my head.

You should familiarize yourself with the rest of post-War 20th century history in the United States, then. Far, far more deadly and dangerous "conspiracy theories" existed before the internet and caused actual deaths of sitting politicians. It's the Presentism of all of this that is making people forget how pernicious it is, letting these powerful organizations have this authority to ban "misinformation".


We have not yet seen all that is to come from these conspiracy nuts. Jan 6th was a beginning; "militias" are itching to shoot anyone to "defend their freedom". The FBI did its job for a number of the attacks I listed, or else the governors of Michigan and Virginia would be dead, and a military response would have had to kill dozens of "militia" members when the state arrested them.


> Far, far more deadly and dangerous "conspiracy theories" existed before the internet and caused actual deaths of sitting politicians.

Care to name an example?


Here's an example that often gets memory-holed.

https://en.wikipedia.org/wiki/1954_United_States_Capitol_sho...


It's really interesting to see how quickly things have changed. The world watched in horror at Jan 6th; we decided we needed to be better than that and stopped a certain someone and friends who were spreading blatant lies that millions chose to accept as truth because it served an end goal, ignoring all evidence and facts disproving them.

Today, you and others who point out what brought us here are downvoted, and others like myself flagged... as if the circumstances that brought us to this juncture mean nothing.

You're right though: CP is illegal, revenge porn we decided as a society is unacceptable, we have slander and libel laws, etc. We've drawn some lines around "free speech", and we discovered another one on the 6th.

I get that some folks who bought the lie are upset now, but this is society marching forward. And it's interesting reading this post and watching it being overrun by extremism.


Just do what smart people have been doing for years - stop using big tech. Stop using Google, Facebook, Apple, IBM, Microsoft. Buy a used computer and install a Free OS and just check out of the entire big tech ecosystem.

Otherwise you get what you deserve. And let’s all disabuse ourselves of the incorrect notion that it’s big tech alone which is ordering censorship. It’s our own governments, who use the many secret laws and intelligence agency relationships (all of big tech is basically In-Q-Tel) to get what they want.

Make it difficult for them. Don’t play nice. Dissent. Stop using PRISM platforms.


I use Linux, discuss political ideas on private, members-only forums, and share memes over Signal. This is an excellent setup... if you want to have conversations with your fellow computer janitors.

This is no way to connect to other people. And those people might also have interesting ideas - ideas I might want to hear before some bot at Facebook deems them against "community standards", or whatever they call their censorship that is totally not censorship.


> This is no way to connect to other people. And these people might also have interesting ideas, ideas I might want to hear before some bot at Facebooks deems them against "community standards" or whatever they call their censorship that is totally not censorship.

Well, people connected with each other before 2007, so presumably you could be exposed to different, interesting ideas by joining local clubs, churches, or other community gatherings where you have interactions with non "computer janitors"


It's an honorable call but most people can't follow it -- like my dad. They're too entrenched.

It's not reasonable, in a connected world-system like today's, to put all the burden on individual people to instantly switch away from bad providers, no matter the level of entrenchment.

Isn't one important role of government to protect constituents from corporate encroachment? Have we given up on electing governments that work for us?


This right here is it, for the most part. I see a lot of people who lament how terrible the takeover of "big tech" is, but they still use Facebook on a regular basis and don't go out of their way to seek alternatives. It's like complaining that Kraft has taken over your local supermarket aisle while you're buying 10 boxes of name-brand mac and cheese.


Not really. There's a gradient of entrenchment. Your example doesn't work at all if we're talking about deeply-entrenched/natural-monopoly products like electricity or internet access instead of a trivially-switchable product like a brand of mac & cheese or ranch dressing.

The question is where does Facebook fall on that gradient? It's certainly not on the "product on the supermarket shelf" end. It's closer to the middle somewhere.

I gave up Facebook in 2013 and more or less lost any semblance of an ongoing connection to a dozen childhood friends from my home country. Many people aren't willing to give up that type of thing. This isn't whatsoever like swappable supermarket products.


Well, that Mac & Cheese is free cuz you're giving them your phone number so they can call you and sell you a gym membership.


"Whoa, it's the guy in that one xkcd comic!"


I love Randall, but he was never right on that issue.

The United States Constitution doesn't own the concept of free speech. The concept existed before the constitution was written; it will exist after the USA falls.

He is correct that it's stupid to argue that a social network censoring you violates the First Amendment.

But he is not correct in arguing that all references to free speech refer to the First Amendment; the fact that the First Amendment only applies to government censorship doesn't restrain the broader concept of free speech to government censorship alone.


Thought experiment: if I have a megaphone rental company and I become aware that one of my clients is using it to walk through neighborhoods annoying people by loudly proclaiming that Covid is a Democratic Party hoax and that Republicans are fools for not acknowledging Donald Trump lost, can I refuse to rent my megaphone to that person and not be guilty of censorship? I own all the megaphones in town, and no one else rents them.


For those looking to censor information for the sake of the greater good:

USA. Little to no censorship (yet). Vaccination rate: 55%

Russia. Mass media is controlled by the state. Vaccination rate: 20%

Hmmmm...


Yes, censorship is bad, but which is worse: admitting a little bit of censorship, or Nazis? Because uh, right now, we have Nazis.

In Germany, if you express Nazi views you will be arrested and thrown in prison. Germany is, according to many metrics, freer than the USA. Perhaps because of its laws that put Nazis in prison.


By the way, the Nazis were successful because they used censorship. If you want to support censorship, beware that this is what enabled the Nazis.


>Because uh, right now, we have Nazis.

We do not, in fact, have Nazis.

Allow me to re-phrase:

Yes, censorship is bad, but which is worse: admitting a little bit of censorship, or Islamic Terrorists? Because uh, right now, we have Islamic Terrorists.

At least one was real in the last couple decades.


It is an ironclad fact that more Americans have been killed by right-wing domestic terrorists than by Islamic terrorists since 9/11.



Just to point out: 9/11 is close to 20 years ago now, and society has marched forward quite a bit since then. While that date as a cutoff point is fairly arbitrary, there has been a substantial and alarming rise in extremist right-wing violence in the last 10 years or so, and that can't just be swept under a rug.


That date is not arbitrary. It is selected to optimize the metric.


Even if it were true and even if you didn't set the boundary at "9/11", which is a helluva point in time to stop counting, I'm not sure how that's relevant. Your original post is the classic "yes censorship is bad, but [insert apocalyptic strawman]" style used elsewhere on this thread but more broadly in media to justify a technological crackdown on people saying things they don't like.


[flagged]


Not OP, but a quick Google search can get you there. If you're lazy, here's one article going over some stats:

https://www.washingtonpost.com/investigations/interactive/20...

I recently stumbled onto a clip on YT of Tucker Carlson using the term "the radical violent left". The data is clear: the right is getting far more violent (and far more violent than this "radical left"). I suspect the media is getting ahead of that narrative and pre-labeling the left with what they're becoming themselves, as a method of cutting off any debate. Think McConnell at the start of Obama's term labeling him the most obstructionist president ever, and then Republicans being the most obstructionist. It's an attempt to destroy any credibility of the evidence by some weird form of, um... pre-poo flinging?


[flagged]


I don't know but if you really want to ask us a question, please do so by emailing hn@ycombinator.com, as the site guidelines ask. And please send links to the specific posts so we can find what you're referring to.


Censorship is like Zeno's dichotomy paradox; today the people promoting Censorship as a solution are happy; give it a long enough timeline and the very same people promoting Censorship will be the ones being censored.


Nah, I'm actually totally on board with big tech censorship, so long as the government doesn't act as an authority to fuse censorship across all platforms and instead acts to ensure that cartelized blackballing doesn't happen, i.e., a ban from Facebook shouldn't cause an auto-ban from Twitter.

Otherwise, I don’t care. Kick me off your platform. It’s your house and you have a right to not share it with me.


> so long as the government doesn’t act as an authority to fuse censorship across all platforms

They literally discussed how they're doing this last week during a white house press conference.


When you consider things like Venmo facilitate social connections and interactions, this spills into payments pretty easily too.

And remember when Google launched Google+? They referred to email as the largest social network of all.. which is why they just started people's G+ connections with their most frequently emailed/emailers.


Yes, that’s why I mentioned it. I am against that.


Did they?


Yes.

> We are in regular touch with these social media platforms, and those engagements typically happen through members of our senior staff

> https://www.newsweek.com/biden-administrations-admission-the...

Psaki also discussed how they're building blacklists of people so that if you get banned on one platform, you also get banned on all of the other platforms.


Nowhere in this article does it support what you're saying.


The first is a direct excerpt from the linked article, which is a high level summary of the press conference. The second is a reference to this

https://youtube.com/watch?v=IwwwRC2xLC0


Okay, she's clearly talking about misinformation as it relates to public health. Do you think it should be up to the individual companies to discern the same dozen or so identifiable sources of anti-vax propaganda that's getting people needlessly killed?


As others have pointed out in this thread, "misinformation" has easily extended to politically inconvenient facts, such as the lab leak theory. The government absolutely does not, and should not, have the authority to censor what it deems to be misinformation. The fact that the government is directly coordinating censorship of a highly political subject absolutely discredits the notion that these are just private companies exercising "corporate freedom of speech". It is a flagrant violation of the First Amendment at this point.

Saddam Hussein not having WMDs and the Iraq War being based on false pretenses was considered misinformation under Bush. NSA mass surveillance was considered misinformation under Obama. Climate change was considered misinformation under Trump. You are advocating for a "ministry of truth" to enforce the official government narrative. People need to be allowed to discuss these things, because the US federal government's track record for being a purely truthful, altruistic entity is quite poor.


That sounds dogmatic and short sighted to me.


This is a ridiculous take. You will get censored no matter how much you try to “talk” about it. You’re dealing with people who will nod, say uh huh, and carry on doing what they were doing regardless of what’s written.

Nothing has really been accomplished in the last decade. It’s a one way ticket to surveillance and censorship.


"Crises precipitate change" - Deltron 3030


"I wanna devise a virus; to bring dire straits to our environment"

Groans in 2021


"Only a crisis -- actual or perceived -- produces real change. When that crisis occurs, the actions that are taken depend on the ideas lying around." --Milton Friedman


"never let a crisis go to waste" - someone


That was Churchill.


There is a way to circumvent censorship: read your news using RSS feeds.

Shameless Plug: I do that and [recommend the best articles](https://UnCensoredNews.us)
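In practice the suggestion amounts to fetching and parsing feed XML yourself instead of letting a platform decide what you see. Here is a minimal sketch using only the Python standard library; the sample feed and URLs are invented for illustration, and a real reader would fetch the XML with `urllib.request` (or a library like `feedparser`):

```python
import xml.etree.ElementTree as ET

# An inline RSS 2.0 sample so the sketch runs without network access.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item>
      <title>First headline</title>
      <link>https://example.com/1</link>
    </item>
    <item>
      <title>Second headline</title>
      <link>https://example.com/2</link>
    </item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in read_feed(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

Because the feed is plain XML served from the publisher's own site, there is no ranking algorithm or moderation layer between the publisher and the reader.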


It would be workable to designate these social platforms as public spaces, shifting liability for speech entirely off of the proprietors and onto the courts, if we also implemented mandatory licensing for use of these networks.

You can’t drive without a license and identification. It’s not dystopian to suggest that, as a society, we’re at a point where we need to insist on public places on the Internet and personal accountability for speech.

We can’t abdicate our responsibility to regulate this space to tech companies and pretend that billions of people interacting is somehow magically beyond the point of governments.


>You can’t drive without a license and identification. It’s not dystopian to suggest that, as a society, we’re at a point where we need to insist on public places on the Internet and personal accountability for speech.

ay yo you got a license for that speech?

Yeah, no, it is dystopian to suggest that people should have to have a license to speak.

And you will never sell me on the idea that speaking on the internet and speaking in public are different enough to justify it.

Social media sites have become the new town square. General public speech has moved from AFK to the internet; you can't just ignore this because it's inconvenient to your viewpoint.

The same protections we have for speech in a town square should exist for speech in the virtual town square, i.e., moderated only by the courts under the protection of the First Amendment.

I'm not going to accept less.


They're completely different. On the Internet, no one can grab you by the cuff and arrest you on the spot. On the Internet, you can disappear into a fake profile or a pseudonymous account and continue to create chaos. On the Internet, Sybil attacks are a real problem in a way that is completely foreign to speaking on your soapbox in the park.

Speech on the Internet is more like a hit-and-run.

It’s dangerous foolishness to pretend otherwise.

But we agree that this speech should be handled by the government and the courts, and removed from the sham jurisdiction of big tech “terms of use”.


More "real names" policies won't do anything about misinformation. People are already mostly accountable for their speech, but since misinformation is legal that accountability doesn't do anything.


> removing liability for speech entirely off of the proprietors

This is already the case. Operators of interactive computer services are not liable for user-generated content.


Alex Jones was able to post disinformation on the radio under his own name for years before he finally overstepped into defaming the Sandy Hook parents.


>The promise of the ACCESS Act is an internet where if you don’t like a big platform’s moderation policies, if you think they’re too tolerant of abusers or too quick to kick someone off for getting too passionate during a debate, you can leave, and still stay connected to the people who matter to you.

Use any number of other communication methods or services that brilliant humans have invented to allow us all to talk to each other within seconds?

How have we reached such technical ineptitude that we can no longer consider options other than big tech platforms when communicating? As if folks weren't stringing wires up across countries 100+ years ago so we could send beeps at each other. Now we're going to write laws so your crazy relatives can talk about their conspiratorial alternate reality, because losing their friends list is such a painful burden? I think I'd swallow a bullet before being asked to work on any technical features for this act. I know that's absolutely going to suck, and I can't think of a worse place to end up after dealing with the difficulty of learning CompSci and programming.

This is going to be a waste of everyone's time and resources for no good social benefit. Literally go fix a bridge with this time and money and it'll do more good for the world.


There are still folks who cannot deal with anything outside of Facebook, i.e., who cannot use a regular browser because they simply don't know how.

We have CLEARLY reached such technical ineptitude.


Ineptitude might be the correct description, assuming one should have a certain level of tech savvy to participate online, but should that really be the case? Apple not long ago ran a "what's a computer?" ad, because participation in the use of computer-powered tools no longer requires an understanding of how those tools work under the hood. While that might be disappointing to us geeks, for most people it's a good thing. I'm glad I can drive my truck to the National Forest without knowing anything about correct ignition timing. Should an understanding of web browsers be a requirement to use the Facebook app?


That's the point. The parent comment bemoans this because there are, of course, other ways to stay in touch. I'm just pointing out that for some folks, Facebook is their internet. Given this, are we saying Facebook is now subject to being a utility that can't boot people off?

I mean, you won't get stranded with your truck because you made an offhand comment you didn't realize would offend someone, right? (Heh, unless you're riding in someone else's truck.)


A lot of the discussion around censorship devolves to "it's a private company, they can do whatever". While this has already been proven to be a false idea based on the comments from the White House actively telling private companies which posts/users to take down, I would also like to share how things happen in China:

"People in the U.S. seem able to recognize that China’s censorship of the internet is bad. They say: “It’s so authoritarian, tyrannical, terrible, a human rights violation.” Everyone sees that, but then when it happens to us, here, we say, “Oh, but it’s a private company doing it.” What people don’t realize is the majority of censorship in China is being carried out by private companies.

Rebecca MacKinnon, former CNN Bureau chief for Beijing and Tokyo, wrote a book called Consent of the Network that lays all this out. She says, “This is one of the features of Chinese internet censorship and surveillance—that it's actually carried out primarily by private sector companies, by the tech platforms and services, not by the police. And that the companies that run China's internet services and platforms are acting as an extension of state power.”

The people who make that argument don’t realize how close we are to the same model. There are two layers. Everyone’s familiar with “The Great Firewall of China,” where they’re blocking out foreign websites. Well, the US does that too. We just shut down Press TV, which is Iran’s PBS, for instance. We mimic that first layer as well, and now there’s also the second layer, internally, that involves private companies doing most of the censorship."

https://taibbi.substack.com/p/meet-the-censored-matt-orfalea


> While this has already been proven to be a false idea based on the comments from the White House actively telling private companies which posts/users to take down

No, it hasn't. A government office telling private actors what it considers to be misinformation and would like not relayed is not state censorship as long as there is no compulsory process. A private party can certainly use "I trust the government, and they say X is wrong" as an input to their consideration of how to exercise their free speech without it being anything other than free speech.

> What people don’t realize is the majority of censorship in China is being carried out by private companies.

China has and uses compulsory process to ensure compliance with its desired censorship, though.


White House's exact words were "we're flagging problematic posts FOR Facebook that spread disinformation". "FOR" is an important word.

https://caitlinjohnstone.substack.com/p/biden-administration...

On top of this, they have already threatened the companies during hearings that if they don't take it down, they will face action. Plus, Biden even called social media companies murderers.

"Congressional Democrats have repeatedly made explicit threats to social-media giants if they failed to censor speech those lawmakers disfavored. In April 2019, Louisiana Rep. Cedric Richmond warned Facebook and Google that they had “better” restrict what he and his colleagues saw as harmful content or face regulation: “We’re going to make it swift, we’re going to make it strong, and we’re going to hold them very accountable.” New York Rep. Jerrold Nadler added: “Let’s see what happens by just pressuring them.”"

This is literally what makes them state actors. I don't know what more they need to do to prove to you that they are state actors. There's also precedent from the Supreme Court.

"For more than half a century courts have held that governmental threats can turn private conduct into state action. In Bantam Books v. Sullivan (1963), the Supreme Court found a First Amendment violation when a private bookseller stopped selling works state officials deemed “objectionable” after they sent him a veiled threat of prosecution. In Carlin Communications v. Mountain States Telephone & Telegraph Co. (1987), the Ninth U.S. Circuit Court of Appeals found state action when an official induced a telephone company to stop carrying offensive content, again by threat of prosecution."

"As the Second Circuit held in Hammerhead Enterprises v. Brezenoff (1983), the test is whether “comments of a government official can reasonably be interpreted as intimating that some form of punishment or adverse regulatory action will follow the failure to accede to the official’s request.” Mr. Richmond’s comments, along with many others, easily meet that test. Notably, the Ninth Circuit held it didn’t matter whether the threats were the “real motivating force” behind the private party’s conduct; state action exists even if he “would have acted as he did independently.”

https://www.wsj.com/articles/save-the-constitution-from-big-...

Bantam Books v. Sullivan (1963):

https://www.mtsu.edu/first-amendment/article/378/bantam-book...

Carlin Communications v. Mountain States Telephone & Telegraph Co. (1987):

https://openjurist.org/827/f2d/1291/carlin-communications-in...


There are really 2 options here:

1. Live with the First Amendment and Section 230 as is. That means things stay mostly the same. Big tech will continue to censor too many of the things I like and too few of the things I don't like. And we wait for the pre-internet generation to die off and be replaced so society can progress.

2. Restrict the First Amendment and force fact checking, fairness, neutrality, etc. This would get us back to a functional media/societal/democratic state a lot faster. But it would require pretty radical change. Also, the biggest losers wouldn't be big tech; they would be Fox News and similar orgs.

I don't see any other option here. I don't see that "regulating" (wtf that means; different things to different people, for sure) Facebook but not Fox is productive or fair or even really possible.

Personally, I don't trust the current system to do anything other than beat tech companies with a stick for no reason but that they exist and don't pay enough in "lobbying". I don't see Fox or similar, much bigger propaganda/censorship orgs being touched. Certainly not on a bipartisan basis (and that's what's needed to do more than sneeze in Washington).

So here we are, and I am glad the right and left are (stupidly) fighting each other...

Thanks for reading, you may now downvote these inconvenient truths...


>force fact checking

It's not possible, period. You can remove this option. FB/Twitter/YT all have "fact checkers," and they are neither neutral nor do they get it right. Even when outsourced, it still fails miserably. On top of that, people should not be told what the "facts" are for complex topics. It removes the need to think and build an opinion. This is essential because for almost everything controversial there is no way to find the absolute truth. People need to learn what it means to accept that we don't know something and likely never will know what the truth is.


Option 2 implies that there is some truly unbiased entity without conflict of interest that could both be able to handle speech vetting and also be allowed to do it. This is extreme fantasy.


I won't pretend I know what it would look like or how it should work. That said, it's not impossible to at least improve the current situation: removing at least some demonstrably false statements would be a start. Requiring "right to reply" and equal air time to different parties would be too.

Again, I don't really support this approach, I'm just pointing out it is what people presumably want since Option 1 is out of favour and that only really leaves option 2.


https://oversightboard.com/ is trying to do this.


Such vetting would be (1) extremely expensive, and (2) in most cases the decision would be "well... we don't know for sure."


The worst part is that people will be told "you don't know for sure" rather than using their brains and coming to that conclusion themselves, which is actually the default for any critically thinking person. And it helps a lot not to fall for extremist views or ideas, which are usually presented oversimplified and thus fool people who quickly pick sides rather than accepting the default of not knowing for sure.

Rather than a fact checker, maybe an algorithm could find opposing content and present it. That would force the user to make some "fact" checks, i.e., to think. Needless to say, such an algorithm could/would be biased as well.


> Rather than a fact checker, maybe an algorithm could find opposing content and present it. That would force the user to make some "fact" checks, i.e., to think. Needless to say, such an algorithm could/would be biased as well.

What do you think of a small social network composed of vetted (and always subject to review) reasonably trustworthy and fairly unbiased but definitely open minded users who decompose, fact check, and deeply debate very small volumes of stories, using a platform more sophisticated than nested discussions with voting? Something with structures like a "points for" and "points against" format, among many other novel (as far as social media goes) features, where "it is not really known for sure" is a perfectly acceptable conclusion, subject to review as new information becomes available?


Almost everything will be "not really known for sure." So the next step to make it useful would be to somehow count/vote and get "X is more likely true than Y." Maybe even add percentages, and now you essentially have "mob fact-checking." Needless to say, the majority isn't a source of truth, especially not if you have a broad range of topics and a broad range of people, so that for any given topic only a fraction of the people have deep knowledge and "the truth" is actually defined by the rest (the mob), who don't have deep knowledge about the topic.

What you would actually need is a peer-review-like system, where people familiar with the topic do the fact-checking. But this just moves the problem elsewhere, because someone would need to define who is familiar with a topic, and without putting people with aligned views together, it's just as impossible as the fact-checking itself.

Lastly, even if we were able to create a working fact-checking system, once that system has been used for one of the long-running controversial topics (for example, fact-checking a statement about abortion being murder), almost everyone who disagrees with that fact-check would lose trust in the system, rendering it essentially useless. You now have a "source of truth," but a significant portion of people (roughly 50%, probably) don't trust it.


If I'm hearing you correctly, it sounds like you're kind of setting a goal of reaching something like "X is True"? My thinking is, very often (usually?) the best that can be done is to decompose X into sub-components as fine-grained as possible, and then tentatively flag things as True/False/Unknown... or maybe even something more fine-grained than that ("Seems True", "Seems False").

From my perspective, the biggest issues are that we refer to issues with extremely ambiguous names/perspectives, and we assert true/false where it is absolutely not finalized. I believe that if there was a system run by a transparent, independent organization that took the definitions and the epistemic status of these issues very, very seriously, some people would start to have some trust in them, especially if they developed a track record over time.
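Purely as an illustration of the "points for / points against" format with tentative flags discussed here, a toy data model might look like the following sketch. Every name and the simple vote-margin heuristic are invented for the example; a real system would weigh evidence far more carefully:

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    SEEMS_TRUE = "seems true"
    SEEMS_FALSE = "seems false"
    UNKNOWN = "not really known for sure"

@dataclass
class Claim:
    text: str
    points_for: list = field(default_factory=list)
    points_against: list = field(default_factory=list)

    def status(self, margin=2):
        # Toy heuristic: only lean one way when the evidence is lopsided
        # by at least `margin` points; otherwise the honest default
        # verdict is "not really known for sure".
        diff = len(self.points_for) - len(self.points_against)
        if diff >= margin:
            return Status.SEEMS_TRUE
        if diff <= -margin:
            return Status.SEEMS_FALSE
        return Status.UNKNOWN

c = Claim("Example claim under debate")
c.points_for.append("One supporting argument")
c.points_against.append("One counter-argument")
print(c.status())  # balanced evidence -> Status.UNKNOWN
```

The key design point is that `UNKNOWN` is the default outcome, so the system never has to assert "X is True"; a verdict only tightens as the recorded evidence becomes lopsided, and it can loosen again when new counter-points arrive.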


>...a goal of reaching something like "X is True"?

Not really; it would serve no purpose. Only the people who already agree with it will trust a system that outputs "their truth."

I don't think the goal should be such a system, because I think it will be flawed no matter what, and also because I think people should be prompted to think and not fed simplified "facts."

People would also do better to stop caring about a lot of the "garbage" facts they get fed every day and actually think and build an opinion on something.

Instead they get an opinion presented and either take it or reject it, based mostly on bias. This is not useful. It would be better if a person who's not sufficiently interested in a topic simply had no opinion on it. At least if he gets interested at a later point, he would not be preoccupied by past "copied/rejected opinions." Many people today have very strong opinions on very irrelevant topics and can hardly reason about their stance, because it didn't "grow" in them; it was mostly planted/absorbed from media.


> Not really, It would serve no purpose, only the people who agree with it will agree with the system that outputs "their truth".

Does this not assume that people's minds cannot be changed? I understand the generalization you're making and very much agree with it, but I suspect we differ greatly on the underlying causality.

> I dont think the goal should be such a system because I think is will be flawed no matter what...

Is "perfect is the enemy of good" relevant here, and perhaps also "perception is reality", and some others?

> and also because I think people should be triggered to think and not feed with simplified "facts".

100% agree. A proper system would have numerous goals and features, I imagine you can think of many that I overlook (despite how much more time I've spent thinking about this problem).

> People would also do better if they stop caring about a lot of "garbage" facts they get feed every day if they would actually need to think and build an opinion on something.

This is a fine idea - how might one cause (force) such ideas to manifest in physical reality?

> Instead they get a opinion presented and either take it or reject it based on bias mostly.

Under the current system, agreed.

> This is not useful.

That depends on one's perspective, goals, etc - it is immensely useful to some people.

> It would be better if a person who's not sufficiently interested in a topic simply had no opinion on it. At least if he gets interested at a later point, he would not be preoccupied by past "copied/rejected opinions." Many people today have very strong opinions on very irrelevant topics and can hardly reason about their stance, because it didn't "grow" in them; it was mostly planted/absorbed from media.

Agree again - so, what can be done to alter this state of affairs? What is the most efficient approach that can be devised and implemented (and, how might one go about that)?

The lack of systems & logical thinking on HN when it comes to certain topics is an extremely interesting phenomenon to me, what do you think?


> force fact checking

Just remember that less than a year ago it would have been your worst enemy appointing the fact checkers.


Nothing in what you said is an "inconvenient truth"; you've just shown yourself to be an ignorant fool, that's all.

At least try and read half the article next time before engaging in conversation.


I didn't downvote but I read twice and I don't understand what you're saying. Maybe you could try and rephrase to make your point clearer?


* Problems with fake news, censorship, etc. are much bigger outside of big tech than inside (media monopolies and propaganda like Fox News, for a start).

* so if you actually care about those issues (rather than just hating big tech) you need to fix things beyond big tech

* that means reform of the 1st amendment

* I suspect that no one actually wants that, they just don't like Facebook etc but they're used to Tucker Carlson or Joe Rogan

Is that any better :)


Yes I understand now thanks!


No worries. I was pretty wordy the first time so it was good for me to summarise. :)



