YouTube, the Great Radicalizer (nytimes.com)
75 points by pulisse on March 10, 2018 | 49 comments



It's not YouTube. It's not Facebook. It's not Twitter. You could write this article about any company these days.

It's a deeper, fundamental devil's cocktail of monetized attention + modern (intensely and immediately metrics-driven) product development + our buggy brains. Any company that wants to make money from the attention of human beings will find the same extraordinarily rapid pull toward extremism. That's what we click. Our brains love these extreme things. The companies are just molding themselves to our brains in real time.

So much of public discourse these last few years has been pulled toward these most base human bugs: fear of the "other", violence, tragedy, banal comedy, sexual deviancy, cute shiny this or that, inspirational platitudes. Even worse, content production itself is now being hooked up directly to the metrics, creating a machine-learning feedback loop spinning out of control.

Any company that wants to fight against this needs to sign up for public, immediate, and painful metrics hits. We saw this happen so directly with Facebook last quarter. These problems are so fundamental that if one company tries to fight them and turn back the dials, another won't, and the company attempting integrity will flat-out start to lose. Our best shot is to create completely different incentive structures. In the current overarching architecture of media and technology, this hell is the clear winner.


I agree with you, but if anything is going to be done about it we need to hold YouTube, Facebook, Twitter, Reddit, etc. accountable.


Accountable for what? We have a disastrous dietary culture in this country, a never-ending bomb of acute illnesses that wastes trillions merely mitigating symptoms that could have been avoided with better information.

An obese nation literally has the quality of cognition tuned down. Persistent inflammation, insulin resistance messing with the brain, eventual diabetes...

Instead of making people into strong kingdoms, we give more advanced tools to kingdoms of the body and soul blighted by people who think ideals matter more than diet and habit.


Well said. I wonder, however, what makes these companies different from traditional information outlets like newspapers. They also compete for attention, but they seem to maintain a more balanced view of the world. Is it because they don't have the ability to target consumers individually?

Edit: What I did here was rubber-duck reasoning, and I realised this: some tabloids are also quite extreme. They target a group of consumers and offer a sensationalistic view of the world. The consumers put themselves in that group by buying the tabloid.


Wonderfully said. Though I think Facebook's efforts this quarter are worthwhile. They have billions of users and a near-unfathomable fraction of the world's people logged in. When they turn back the dials, are they really more likely to lose than to make a difference in the world?


> It's a deeper, fundamental devil's cocktail of monetized attention + modern (intensely and immediately metrics-driven) product development + our buggy brains.

You know, radicalization and retreat into conservative values have happened literally hundreds of times in history. There have been many "causes". Military confrontations, even accidental ones. Resources. War. Disease. Large-scale accidents. Assassinations. Drugs. And now, of course, "social media".

However, if you go and examine those periods, the same thing is always apparent: these things always happen after the end of a period of economic prosperity and at the start of a period of economic regression. Many people, especially those with an agenda, say that the economic damage is being done by the extremism and the retreat into conservatism, but upon closer examination it becomes clearer and clearer that the economic damage came first, then the retrenchment, then the extremism.

Now I get it, this is the 100th or so time we've seen this regression happen, and once again you can see very clearly that very large-scale economic events immediately preceded it: the "great financial crash" of 2008, which really ran from 2008 well into 2011, and for many people there has still not been a recovery. And that's in the US. For quite a few people outside of the US, it hasn't even stopped crashing. In many places 2008-2018 was a lost decade. A decade spent on nothing other than damage control.

I get that all these new things appear to be playing a big role in people's lives and therefore "must" be responsible, but they're not. They don't matter. They are the "big thing" that's different when comparing 2018 with 2007. Sure. I'm with you. But what about the 70s versus the 50s and especially the 60s? Social media... what role did they play? How about the late 30s and 40s in Europe versus the 20s and early 30s? Can't get much more extreme than that, and yet I don't see Facebook's fingerprints anywhere. And all of those pale in comparison to what happened at the end of the 19th century.

At which point do we say that, no, it's not YouTube, Facebook, Twitter, or companies at all? It's that the government is unable to keep the economy growing, or perhaps we should say unwilling. For instance, people being forced out of the Bay Area because no opportunities exist for their education, who haven't landed a job paying half as much as the one they had until 2009, are going to be resentful and maybe even extremist. This economic marginalization is happening everywhere.

Now I will say, social media and YouTube are making it clearer, quicker, that this is what's happening, but they're not a cause. Completely destroying them would do nothing but slow it down slightly.

But the real problem the government sees is the need to start a big anti-opioid program. Not opium directly, or drugs generally, but they want to prevent Americans from taking pain medicine. Not, of course, by providing the care that would take away the pain, but simply by taking pain relief away from them, making it expensive. Of course, nobody sees the obvious result coming: more extremism, more alcohol abuse. If you've got a hernia that you can't get treated because it's too expensive, no amount of good intentions will get you off pain medication. And if you can't get an opiate or pain treatment (or better yet, an operation), you WILL use alcohol. Especially in a cold climate. Willpower will delay this by a few months, but no more.

It's the economy, stupid. What's happening is that the overcapacity built in China since 2003 or so has been overwhelming the world economy and is forcing this move backwards. Trump is doing more to counter it than Hillary would have, but even he is merely tinkering at the edges.


Or maybe people are naturally more attracted to the extremes (left and right) when given the choice. Outlets such as the NYT or their right-wing counterparts have historically filtered out extreme views. What if YouTube was just giving people what they want and the NYT was the de-radicalizer? For better or for worse. Also note that today's mainstream pro-democracy/social-democracy views used to be regarded as radical a couple of centuries ago, or still are today a few thousand miles away.

Radical ideologues, by virtue of not holding any power, are free of corruption compared to mainstream politicians, and for this reason they always attract the crowd when they (rightfully) denounce the corruption of the current powers that be. Not that they wouldn't be just as corrupt in their place.


> Or maybe people are naturally more attracted to the extremes [..] What if YouTube was just giving people what they want

That's part of it, but I think the point is more subtle. YouTube shows the content that will keep people on YouTube (to maximize advertising), but that doesn't mean it's what they most want.

Given good or bad news, we may prefer the good but spend more time on the bad, even if the bad news is something that's irrational to worry about.

The way to get people to spend more time on YouTube may be to make them worry more, care more, and overall be more involved in the topic by showing more extreme aspects of it. That may be more exploiting human psychology (perhaps unintentionally) than giving people what they want, or maybe it's giving our darker sides what they want.


>[not] giving people what they want

>giving our darker sides what they want.

This makes it sound like people have almost no agency or say in it.


These sites are designed to remove as much agency as possible, so that users are driven to behave in predictable, repeatable and easily monetizable patterns.

The question is, how much agency can they actually remove? If someone uploads "radical" material and it's popular with other "radicals," then is exploiting that any worse than doing so for mainstream content?


I don't agree with the way you phrase it.

No freaking suggested video is going to rob any individual of their agency, any more than yet another hot-take NYT op-ed is going to rob me of mine. Their algorithms, however, do move the average, increasing monetization. The average is not the same as the individual.

I don't see that as removing agency, even if it is influencing individuals a little. There are dark patterns, like those employed by shady mobile game apps, that hook players into addiction. The recommended videos on YouTube don't reach that level in my opinion, and it's a category error to lump them together.


> giving people what they want

This implies that people have static, immutable sets of preferences. Do you believe that?

Just curious. Were there any foods you hated as a child but now like? What about musicians? Movies? Books? Art?

I ask (somewhat acerbically) because I see this argument so often and I struggle to see how anyone would believe that humans have static preferences over time. The natural corollary of humans having mutable preferences is that the entire discipline of marketing is geared toward discovering and shaping preferences for profit.


I'm not seeing anything in the parent comment that implies people's desires are static and immutable. On the contrary, it points out that "today's mainstream pro-democracy/social democracy views used to be regarded as radical a couple of centuries ago" to provide an example of changing preferences. The dynamic nature of people's desires is part of why I think YouTube does so well. It reacts very effectively to changes in what's trending.


> Or maybe people are naturally more attracted to the extremes (left and right) when given the choice.

If by "given a choice" you mean that people are usually not given a choice, but were in mid-1600s England, late-1700s France, mid-1800s USA, and most of the world in the early-to-mid 1900s, then yes. Or there can be changes in the forces of production and their various superstructural elements that cause this to happen. People are drifting away from the center and the status quo in Greece, but unemployment there was at 28% in 2013 and is still above 20%. Thus many people are bouncing between anarchism, fascism, and communism. The current prime minister is head of a party of what were called in the 1980s "euro-communists" - people on the right wing of the communist movement (the KKE are the left-wing communists).


I think you're using "corruption" in a way that's not really justified. All sorts of radical ideologues build self-serving cults of personality that provide them with a sustainable grift. That is, the ideology is not something they really live in any deep way, but rather is just a means to lining their pockets. That seems just as corrupt to me as any politician on the take.


Some examples I've noticed personally:

* Clips of Bill Burr on Conan O'Brien > Men's Rights / anti-feminism videos

* Animal videos > Creepy channel where someone makes various mousetraps and sees how many kills they get a night

* Videogame longform critiques > Pro-Gamergate / Men's rights

* Movie analysis video essays > Video essays on how SJW's are ruining Star Wars / Marvel

* General psychology / self-help > Clip from a ho-hum Jordan Peterson psychology lecture > "Oh, you watched one thing featuring Jordan Peterson, you must agree with his more extreme views. Have some videos about that."


I've been noticing a similar phenomenon. No matter what news clip I click on, my recommended videos will always be filled with right-wing videos. So if I click on a clip of Fox News I will get recommended some clips from, say, Ben Shapiro (expected). But if I click on a clip from MSNBC, CNN, The Young Turks, or even Democracy Now, my recommendations are still filled up with right-wing content (undesirable). And unlike the left-wing videos that might be recommended to me on a left-wing video, the right-wing videos will follow me around the site no matter what content I watch.

I wonder if an over-reliance on neural networks is what has been causing them so many problems recently.


It's quite sad that people consider videos about "men's rights" a radical view.


"Men's rights" is often a code-word for "radical misogyny".


This sentiment is exactly why communities focusing on men's rights end up revolving around "radical misogyny".

You start discussing something like paternal rights or men's mental health. Because it is about "men's rights", people denigrate it as "misogyny". Those who actually care about equality distance themselves from the group, misogynists move in, and the group is either destroyed or becomes a hive of "radical misogyny". A self-fulfilling prophecy.


MRA is a few good points wrapped in a seven-layer dip of paranoia, sexual entitlement, harassment, and misogyny.

It's 'not radical' in the same way that hang-all-the-bourgies anarcho-communism is not radical - that is, not at all.


Most people's radical view is simply a view different from their own.


We wouldn't want gender equality to be too equal, now would we?


> General psychology / self-help > Clip from a ho-hum Jordan Peterson psychology lecture > "Oh, you watched one thing featuring Jordan Peterson, you must agree with his more extreme views. Have some videos about that."

I find Jordan Peterson's views quite anodyne. I suspect your opinion that he has "extreme views" is based on hearsay and not on anything you actually heard him say or write. Of course, I could be proven wrong by a link to some actual extreme statements he's made.


[deleted]


>Most of the things he says are extreme right wing stances

Again, I don't believe you've actually heard or read any extreme things he's said. I think you're just following the herd. (Or the "heard").

It's like with James Damore. Many people claim he said things in his memo (for example, that women are genetically unsuited for software development) that he absolutely did not say, nor could any reasonably intelligent person infer them from what he wrote. Yet many people here on HN, whom I would presume are of above-average intelligence, continue to make that claim.

I don't believe that you have ever read or written a documented quote of Jordan Peterson that a reasonable person would consider extreme. You can prove me wrong, however, with a simple link.

The GP did not provide any substantiation for his claim when I challenged him to, nor do I suspect you will.

How about it, faizshah?


[deleted]


>If you believe that so strongly that you can assert I have never read anything by him, then we have nothing to discuss. The argument isn't winnable.

faizshah, you're making absolutely no sense here. I asserted no such thing. Here are my exact words.

"I don't believe that you have ever read or written a documented quote of Jordan Peterson that a reasonable person would consider extreme."

>If you just take for example his views on race and IQ, he holds a classic far right wing view that no reasonable person would call a moderate, centrist or left view.

That is a ridiculous, false, and completely unsupportable statement. Which is why you offered nothing to support it. Pure hand-waving.

>The Damore memo is another great example, defenders will never accept that Damore's key argument was that "the distribution of preferences and abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership"

To the contrary, I will indeed accept that is his main argument. It may be right, or it may be wrong, but it's a perfectly reasonable hypothesis. And now Google, and the left in general, are the Catholic Church to Damore's Galileo. It's not enough to disagree with him, and they will not engage him in debate. They have to punish him for his heresy and bar any and all discussion of his ideas. (E.g., every story about Damore gets quickly flagged into oblivion here on HN.)


> Most of the things he says are extreme right wing stances on issues

By what means did you come to that conclusion? Is there some data that breaks down a list of his views and the weight on the left/right spectrum?

Are his extreme views his own construction, distortions of research, cherry-picked data, or real data that is simply not agreed upon?


A recent Corbett Report piece covers YouTube and alternatives such as BitChute and DTube, given the recent purge and the '3 strikes' 'community policy' constraints on commentary unpopular with 'moderators': https://www.corbettreport.com/youtube-is-now-themtube-time-t...


I think, in the technical sense, "radicalizer" is less apt than "signal amplifier". It seems YouTube is good at identifying a single, perhaps too-specific theme and focusing on it, so "jogging" leads ultimately to extreme marathons instead of broader-based "fit and healthy" content. Maybe it needs to optimize for amplifying a wider band, but to a lesser degree.
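
A minimal sketch of the "wider band, lesser degree" idea (a toy illustration only; the topic names, numbers, and temperature knob are assumptions, not anything known about YouTube's recommender):

    import numpy as np

    # Hypothetical topic affinities a recommender might infer from watch history.
    affinities = {"jogging": 0.50, "marathons": 0.30, "nutrition": 0.15, "yoga": 0.05}
    topics = list(affinities)
    scores = np.array([affinities[t] for t in topics])

    # Narrow band: always push the single strongest theme.
    print("argmax pick:", topics[int(np.argmax(scores))])

    # Wider band, lesser degree: sample from a tempered distribution so related
    # themes still surface while the dominant one is amplified less aggressively.
    temperature = 2.0                      # >1 flattens the distribution
    soft = scores ** (1.0 / temperature)
    soft /= soft.sum()
    picks = np.random.default_rng(0).choice(topics, size=10, p=soft)
    print("tempered picks:", list(picks))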


This is a problem that's inherent to many recommendation systems. Viewing history feeds into recommendations, and recommendations drive viewing habits -- so there's a feedback loop. Depending on how the recommendation engine is tuned, this can result either in recommendations that drift (e.g., leading users to "the weird side of YouTube") or ones that hyperfocus on one topic (as described in this article).
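
A minimal sketch of that feedback loop (a toy, urn-style model, not a claim about how any real recommender is built; the topic names are invented):

    import random
    from collections import Counter

    random.seed(1)
    history = ["jogging", "jogging", "marathons", "cooking"]

    for _ in range(200):
        # Recommend by sampling past views: frequent topics surface more often.
        recommendation = random.choice(history)
        # The user tends to click what's recommended, which feeds the history.
        history.append(recommendation)

    # Whichever topic starts out ahead usually ends up dominating the history.
    print(Counter(history).most_common())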


I've been watching "hbomberguy", who is doing a good (and entertaining) job of dismantling the claims of the YouTube right wing. https://www.youtube.com/user/hbomberguy/videos


Another great channel in the same area is "Shaun"; he does a great job of breaking down and debunking the claims of a lot of popular right-wing channels. https://www.youtube.com/channel/UCJ6o36XL0CpYb6U5dNBiXHQ/vid...


While we're recommending leftist youtubers, I think ContraPoints[0] is wonderful, educational, and entertaining. She has made me reconsider a lot of things.

[0] https://www.youtube.com/user/contrapoints


There's a simple solution. If YouTube recommends a video you're not interested in, click on the three dots to the right of the video and select 'Not Interested'. It then gets removed from your recommendations. Occasionally I've had to do it more than once for it to stop recommending similar videos, but otherwise I can confirm that the YouTube algorithms quickly get the message.


I like the article, and the first thing I thought was the same as you: manually help the recommendation algorithm by assigning 'not interested' tags. But leaving this as a standard routine for billions of users is not an effective option, and the system has measurable effects on society while exploiting human behavior to collect money from advertisers. It boils down, again, to the root behavior of the money system: manipulate and collect money. It was never intended to do anything else.

I like that the article at least acknowledges this state of affairs as the status quo. Understanding the system means stepping back from it in the first place, before asking how anything can serve a better purpose for society in general.

On a deeper level, I guess it's even more about understanding how our subconscious mind is shaped by society and how we unconsciously select information in our daily activities. I mean the whole mass of people and every individual within it.


I'd suggest the main challenge is in breaking out of echo chambers. It's healthy to engage with people you (respectfully) disagree with. Not doing so can easily lead to misinformation being spread amongst like-minded people.


The problem is not videos being recommended to me; it's other people being radicalised by the videos suggested to them.


We should all take personal responsibility for the media we consume. It's not up to Google to shield people from less nuanced political views just in case we take them too literally.


Seems to me that it's just easier than ever to fall into an extreme echo chamber, regardless of which side that is on. Pick whatever flavor you want and you will find a community for it online, with all sorts of supporting material. It's a natural consequence of everything being open and available for consumption; it's pretty hard to have just the good without the bad.


For those interested in the history and process of radicalization, I strongly recommend David Neiwert's book "Alt-America". [1] He's a journalist who spent decades covering the "Patriot" fringe in the US, which often had elements of white supremacy, conspiracy thinking, anti-government paranoia, and other nuttery. That gives him unique depth on how the Internet, for all its benefits, also made it much easier for political extremists to connect and organize.

[1] https://www.amazon.com/dp/1786634236


If you have kids at home, I strongly recommend blocking YouTube. We did this about a year ago after our son shared that his friend had found lots of disturbing videos.


If the algorithm is directing people to more 'extreme' content for a given topic, does that imply it has some latent notion of what is less extreme? In order to draw a vector from one point away from the center, it'd need to have some concept of where the origin is.
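
One hedged way to make that concrete: if videos live in some learned embedding space (an assumption for illustration, not a description of YouTube's system), "extremity" within a topic could be read as distance from the topic's centroid, which would be the implied origin:

    import numpy as np

    rng = np.random.default_rng(0)
    video_embeddings = rng.normal(size=(100, 8))    # 100 hypothetical videos
    topic_centroid = video_embeddings.mean(axis=0)  # the implied "origin"

    def extremity(vector):
        # Distance from the centroid as a crude stand-in for "how extreme".
        return np.linalg.norm(vector - topic_centroid)

    most_extreme = max(video_embeddings, key=extremity)
    print(round(float(extremity(most_extreme)), 2))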


Western society already went through this. Precursors of copyright law were spurred by the invention of the printing press. People who might use the press to spread unapproved politics or theology were to be denied access to the new technology.

The authoritarians lost.


The radicalization of the mainstream media is far scarier to me than some random people making Pepe videos (which, although I would consider myself conservative, never get recommended to me by YouTube). White men = evil. Read all about it! It's okay to say it now -- just accept it. White people are fucking scum of the earth. It's time that we finally DID something about it. Source: NYT, Washington Post, Democracy Now, western universities. I guess it's time for western civilization to collapse already.


Conspiracy theories and disagreeable speech are still speech and are still protected. I understand not wanting to run ads against that content, but advocating memory-holing it altogether is the morally wrong argument.

People will ultimately choose what to watch and spend their time on; this insistence on shielding and infantilizing the general public is patronising. Not to mention that there are no studies indicating that any of this content is changing anyone's opinion; the few scientific studies on the "fake news" scare suggest that mainstream outlets were, and are, the shapers of public opinion, e.g. https://cyber.harvard.edu/publications/2017/08/mediacloud

The attacks on YouTube are especially alarming, with the scope of the censorship demands ever increasing and every new feature being blamed. First it was about running ads on obscure videos and the channels of "controversial" personalities, and now they demand that viewers not be suggested content that the author thinks is objectionable, as though they are forced to watch it "A Clockwork Orange" style. I'm old enough to remember when this author and her peers were defending web firms against censorship attempts by governments around the world.


And the old media was a propaganda machine. And a total waste of cognitive frames. How I wish I could trade all the stupid ads I saw for a memory of a few seconds with my wife.

If people are being radicalized, it signals that maybe the system of incentives has been gamed. We live in the shadow of another housing bubble. We live in the shadow of a pension bubble. We live in the shadow of possible emergent dystopias, whether left-aligned or right-aligned. And we used to live in the shadow of a massive propaganda machine. This type of extremism is born of the shock of hearing good, actionable information over bad information forged in bad faith.

Imagine growing up and finding that adults, actual people with actual responsibilities, act like children when it comes to politics. High school teachers making fun of George W Bush as if he were a total idiot. That was very bad information. He might have squandered blood and treasure on Iraq, but he managed his image well, even coming off as "a simple, down-to-earth man" to his supporters, down to the accent he adopted despite growing up in old WASP holdouts. Nope, instead grown adults preferred to amuse themselves with an image of a total idiot president, which was both disarming and catastrophic to understanding how power works in this country.

Surely there is a "center" in this country but the "center" gave us unmanageable time bombs of massive liabilities bequeathed, "awarded" one might say, as burdens for future generations.

As far as YouTube goes, it's not proper to blame the failings of ordinary people. If anything, it's our useless public education apparatus that let bad information, bad faith, and bad habits become a triumvirate incarnate. So much cognition wasted so that a class of bureaucrats (school administrators and clerical staff) could collect pensions that future generations will be paying for very dearly.

That's my two-cent rant.


> As far as YouTube goes, it's not proper to blame the failings of ordinary people.

They're making money by abusing our weaknesses. That's what happens when you optimize for attention and ad impressions.


You act as if we were helpless. You're just seeing the ugly side of "average" people: how quick they are to form blobs around novel information, how lacking in critical thought most people are. It sounds almost absurdly cynical, but my experience has made me cynical about most other people and their ability to discriminate beyond the two categorical blobs of "right" and "left".

Not to mention that focusing on YouTube is a minor thing, especially when the quality of cognition can be radically changed with diet and habit. Imagine all the people drinking milkshakes from Starbucks, enjoying that sugar and caffeine rush. Remember, it's not a coffee company (profit-wise); it's a milkshake company that blends in coffee.

Look at the sugar content of their items.


And to add, they're even worse than milkshakes. At least a milkshake has some fat content, which slows the insulin response.



