I agree with your assertion, and I'd add a point about the "content moderation is difficult" argument.
The current state of content moderation - or lack thereof - on these platforms didn't happen by accident. It's the direct consequence of their business model.
If you create a digital space that hosts millions of people, you also pull in the fraught complexity of human relationships. A digital space isn't exempt from the community governance challenges you'd encounter in the physical world.
However, the businesses that provide the infrastructure and engineering underpinning these digital spaces aren't in the "building communities" business. They sell advertising and business intelligence, which is a very different proposition.
As such, the design of the large platforms never quite incorporated proper tooling that models the complexities of (self-)governing millions of people in a central digital space. On the contrary: recommendation engines ("people you may know", "people to follow", "this might interest you", "this article was shared x times") do the exact opposite. They have helped shape emergent social dynamics that spill over into the real world, with real-world consequences, creating social feedback loops that escape control.
Rather than acknowledging this fundamental issue, these centralized platforms assumed they could solve it through centralized content filtering, or through centralized human moderation teams bound by company policy. That's an incredibly difficult proposition given the amount of digital data produced daily by billions of people.
So why did they choose this type of content moderation? Because the social issues that stem from these platforms are treated as an expense to be externalized. There's only an incentive to invest in content moderation up to the point where the issues caused by the people they host threaten the performance of the business model.
In that regard, I think a "social media market" where platforms provide affordances for equitable (self-)governance of communities requires a fundamental rethinking and overhaul of the business models and incentives that underpin such a market.