Hacker News
[flagged] YouTube Recommendations Lead to More Extremist Content for Right-Leaning Users (ucdavis.edu)
29 points by grammers on Dec 15, 2023 | 40 comments



Apparently this is a thing on Audible as well, as I learned recently from Penny Arcade

https://www.penny-arcade.com/comic/2023/12/01/algo-rhythms

> Gurb turned me on to a kick-ass book called "The Mysterious Case Of Rudolph Diesel," and I think you should read it if you're interested at all in the world, but you should buy it with cash in a town you don't live in and read it in a dimly lit cavern. Because if you don't, if The System finds out you read a book about a fascinating historical character and his mysterious disappearance, you'll be clocked immediately by their tendrils as… whoever this is.


I enjoy the comic and just about everything else everyone at PA develops, but it's Holkins's thrice-weekly displays of absolute word sorcery that have kept me reading since 1998.


This is not at all surprising. Make a brand new account and watch a few welding or machining videos. You'll be getting PragerU, Daily Wire and Tucker clips in no time. It goes downhill quickly from there. The targeting is pretty explicit.


> The targeting is pretty explicit.

"Targeting" isn't the right word, "statistics" is. These are statistics based, data driven, mechanisms that do their best to find things that give the best chance of you positive interacting with them, with positive meaning money going towards the people behind the algorithm.


"Targeting" is the right word, because many people with no prior interest in extremist topics get fed them by the algorithm. As long as even a small percentage of those people get hooked by that extremist content and end up spending vast amounts of time on youtube sucking it up the algorithm did what it was created to do, but that doesn't mean it wasn't absolutely targeting a group of people.

It might be "statistics based, data driven, mechanisms" that cause a fisherman to bait a hook, but he's still out there targeting fish and waiting to see which ones bite.

For the record, I don't buy the argument that youtube's algorithm can radicalize most people. For example, most people who aren't Nazis could listen to Nazis talk all day long for years and never start hating other people because of their race. People who study/track hate groups do exactly that without issue. No amount of listening to Nazis will suddenly turn them into one.

The problem is that there exists a small number of people who are genuinely vulnerable to that stuff, and other people who can be influenced by some extreme ideologies after being desensitized by floods of even more excessively extreme ideas. When you're exposed to an endless stream of totally insane views, the mildly crazy ones can seem saner than they would have otherwise.

I don't think that youtube should censor content out of fear that some people will be vulnerable or desensitized to extremism, but it'd be nice if youtube didn't explicitly push it on people who never asked for it in the hopes that some of them will get suckered just because that drives up engagement.


Daily Wire isn't particularly 'extreme' or 'conspiracy theorist'. I am less familiar with Tucker Carlson but while I find his takes on Russia distasteful, I haven't seen any extremism.


Extremism is relative to where you are on the political spectrum. That's why far-right people call even the most modest social program 'communism' and why the far left would call even minor criticism of a social program 'fascist'.

You haven't seen extremism from those media sources because, relative to your perspective, they aren't extreme.


I think that's a common behaviour, but also an incorrect one that I do not indulge in, nor have I given you any reason to think that I do.

Extremism is the advocacy of violence as a solution to political problems. Objectively I have not seen The Daily Wire call for genocide because it has not happened.


What I was saying, I feel, applies to everyone, you and me included.

'extremism', 'advocacy', and even 'violence' all have subjective definitions dependent on nuance, context, and subtext. Exactly what you (and I) perceive these to mean will depend both on us and on what is being discussed. That's just the way it is. There is no absolute objective viewpoint when it comes to human politics.


In some ways I think this is a tricky problem, since you want users to get deeper into some topics, but not into ones that are considered "problematic", and defining those is inherently political.

It seems like you could define some idea of "depth" into a topic (based on how far outside normal viewers' patterns it is), and only generate recommendations for items that aren't far outside the norm, but this would lead to a lack of depth for recommendations in niches.

Maybe a middle ground would be to treat sensitive topics differently in terms of "vertical" recommendations, by e.g., explicitly marking some categories as safe and enabling recommendations to go deeper, but only allowing "horizontal" recommendations for unknown topics, and maybe preventing recommendations "into" that topic from the outside.

So... if you're watching train videos you might get to see even more niche ones, but welding won't get you Fox News recommendations, and watching Fox News won't show you Alex Jones recommendations. Roughly, the policy might look like the sketch below.
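A minimal sketch of what I mean, assuming some hypothetical topic labels and a per-video "depth" score (how far outside a typical viewer's patterns it is); none of this is how YouTube actually works, it just illustrates the vertical/horizontal split:

    # Hypothetical policy: "vertical" (deeper) recommendations only within
    # explicitly safe topics; everything else stays "horizontal" (near the
    # viewer's current depth), and nothing recommends *into* a sensitive topic.
    SAFE_TOPICS = {"trains", "welding"}           # vertical recs allowed
    SENSITIVE_TOPICS = {"partisan_politics"}      # horizontal only, no recs into it

    def allow_recommendation(current, candidate):
        """current/candidate are (topic, depth) pairs; depth 0 = mainstream."""
        cur_topic, cur_depth = current
        cand_topic, cand_depth = candidate

        # Never recommend *into* a sensitive topic from outside it.
        if cand_topic in SENSITIVE_TOPICS and cand_topic != cur_topic:
            return False

        if cand_topic == cur_topic:
            if cur_topic in SAFE_TOPICS:
                return True                           # vertical: go as deep as you like
            return abs(cand_depth - cur_depth) <= 1   # horizontal: stay near current depth

        # Cross-topic recommendations stay close to the mainstream.
        return cand_depth <= 1

    # allow_recommendation(("welding", 2), ("partisan_politics", 0)) -> False
    # allow_recommendation(("trains", 3), ("trains", 7))             -> True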

I pick on the right here since it's the topic (and I'm left-leaning myself), but I think radicalization is an issue on the left as well (though frankly my political opinions make me think it is less impactful there, mostly because the way people radicalize on the left, I believe, tends to affect less marginalized people or play out in terms of policy rather than hitting people who are already beaten down).


While left-leaning users are only presented with a healthy selection of diverse and well-argued videos expressing a range of perfectly reasonable viewpoints.


> healthy selection of diverse and well-argued videos

That’s what left-wing extremism is.


This seems a bit suspicious because there is a trend of defining right wing content as extremist. And I'm not interested in whether something is classified as a conspiracy so much as whether it is true.

I'll pick on ivermectin during COVID as an interesting case. Now, obviously, if you have two groups and one has parasites but the other doesn't, then the parasite-free group will get better COVID results. So, as expected, people treated with ivermectin got better COVID outcomes.

It took a long time to get the message out to explain that effect because, in the spheres I listened to, everyone who pointed out the statistically significant result got shut down with logical fallacies. "Conspiracy theorist" was definitely one.

I'd rather be completely correct, but I'm happy to fall for the occasional conspiracy that is backed by statistically significant evidence. People who fall for that sort of mistake are going to get better results long term than people who ignore evidence. But this study would classify that sort of evidence-based reasoning as a right-winger being led into extremist conspiracy content. I mean, I dunno. A branch of the right wing believes in looking at primary evidence. That means they get things wrong, and sometimes right, in ways out of sync with the mainstream conversation.


The fact that the idea of "conspiracy theory" survived the Snowden reveals shows how brainwashed/forgetful people can be.


Why wouldn't it? It's not as if the fact that some conspiracies are real means that they all are. There are plenty of loons with all kinds of crazy conspiracy theories, many of which have been proved wrong, like satanic rituals in the non-existent basement of a pizza shop or the Earth being flat.

As long as there are paranoid people believing in crazy things without evidence (and often in opposition to actual evidence to the contrary) and others taking advantage of those people, the idea of "conspiracy theory" will flourish.

We just have to understand that not all conspiracy theories are equal, and that legitimate concerns can get dismissed as being a "conspiracy theory". We'll always have to evaluate theories involving conspiracies according to the evidence we have and decide for ourselves how likely they are and which ones are worth our time/energy.


Jeffrey Epstein too, if we are going for specific examples. It is kinda tricky when conspiracy theorists say that the elites are a bunch of paedophiles and the evidence is that there is rampant, accepted paedophilia afoot in elite circles.

I suppose the conspiracy part is guessing how common it is?


True for left-leaning users too, of course.


The whole takeaway of the study is that the effect is significantly more pronounced on the right. More categorical details are in the supplementary material.


Excellent, the algorithms provide what the customer wants. It's not YouTube's place to be a nanny.


Is this missing an /s? YouTube is literally the nanny that wrote the algorithm.


Yep, everybody that watches welding videos wants to be fed right-wing propaganda.


Statistically speaking, right-wing propaganda is far more popular than welding videos, so of course the algorithm is going to push it to anyone and everyone. Especially if there is any overlap in interests or subscriptions between the two groups.

The actual goal on the part of Youtube is not political activism but engagement for ad views. Polarizing content achieves that for people who want it and people who hate it.


Seems like the free market is operating correctly. People on the right get right-wing content and people on the left get left-wing content.

As I am absolutely sure it also leads to extreme left wing content.


I realize this is a contentious topic, but I don't think it necessarily deserves to be flagged.


True for Twitter as well.


The journey to the dark side starts with some interesting Joe Rogan video, that takes you to Jordan Peterson, then Matt Walsh and ends with Stefan Molyneux, Lauren Southern and Alex Jones. That is the bottom of the YouTube iceberg. Below that point videos get mass reported and taken down.


I'd love to know what the "left wing" equivalents to the above are. Looooots of comments in here claiming the "same thing" happens with left-wing content, but I'm not aware of extremely popular left wing outlets that lie about literally everything they report on the way Alex Jones, PragerU, and the like do.


It’s either misdirection or they think things like the NYT or CNN are ultra far-left. YouTube is REALLY reticent to recommend LW content. I just opened an extremely LW video and the recommendations are stuff like Piers Morgan (with Andrew Tate), religious stuff and 'WOKE [x] is humiliated' sort of chaff.


Gravel Institute, Thom Hartmann, Sam Seder, Richard Wolff, The Young Turks, etc


> I'm not aware

This public admission is a great start; keep going.


This is true for most algorithms on user-created-content sites. Also, this isn't exclusive to right-leaning content; it's the same for left-leaning and other types of content. It's just how algorithms work.

The real question should be, should we prevent this type of content from getting recommended, and where are the lines?

As a side note, I'd love to see Twitter-style Community Notes implemented on YT. It's the one good feature Twitter has implemented in a long while. And yes, YT has notes, but they're done by YT themselves (the COVID ones, for example).


I'd genuinely like to see what moderate right leaning content is even available for consumption. The only thing anyone seems to talk about anymore is the grifters and lunatics.


Plenty of moderate right content with regard to education concerns, government spending and taxation, and company regulations.


I suspect that the overall premise of the paper is correct, but it's interesting that they repeatedly reference lists of what they call "problematic" right wing categories such as “IDW,” “Alt-right,” “Alt-lite”, “AntiSJW”, “Conspiracy”, “MRA”, “ReligiousConservative”, “QAnon”, and “WhiteIdentitarian” while they seem to only recognize a single category as extremist left content: the "Socialist" category.

If you're specifically looking out for a long list of right-wing extremist content categories but only one category of left-wing extremist content, is it any wonder that you'd find that youtube pushes people to extremist right-wing stuff to a greater extent than to the extremely limited left-wing extremist content being considered?


No shit, Sherlock


The study considers the following "Very Left" [1]:

- MSNBC

- Senator Bernie Sanders

- Elizabeth Warren

- Vox

I mean, I suppose it is understandable if your political experience is solely American. But I do wonder: if one considers these "very left", what will happen when they come across political concepts such as Anarchism? If they read Malatesta's writings, for example, would their minds just explode?

[1]: https://www.pnas.org/doi/10.1073/pnas.2213020120#supplementa...


> In this study, the research team defined problematic channels as those that shared videos promoting extremist ideas, such as white nationalism, the alt-right, QAnon and other conspiracy theories. More than 36% of all sock puppet users in the experiment received video recommendations from problematic channels. For centrist and left-leaning users, that number was 32%. For the most right-leaning accounts, that number was 40%.

They defined problematic channels as anything specifically espousing far-right ideas, and found that right-wing users were only slightly more likely to be recommended content from them.

It's kind of disappointing they couldn't find something problematic or conspiratorial from the left, even just for the sake of comparison.


It must be election season. Like clockwork, the pro-censorship hit pieces start rolling out


> For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content. Recommendations for left-leaning users on YouTube were markedly fewer, researchers said.

This depends on the researcher's definitions of 'extremism' and 'conspiracy theories'.

- Recently we've seen many left wing people state that disassembling people in front of their families - surely an 'extreme' act - is a 'beautiful act of resistance', and that calls for genocide against Jewish people (surely also 'extreme') may not constitute hate speech in some contexts.

- For the last 7 years we've had many people believe in the Russiagate conspiracy theory.

- I'm not sure "problematic" has any real meaning.


How do I get my YouTube feed to do this? All I get is a stream of incoherent leftist nonsense.




