
> Sorry, can I just interrupt you there for a second? Just let me be clear: You’re saying that there is no rabbit hole effect on YouTube?

> I’m trying to describe to you the nature of the problem. So what I’m saying is that when a video is watched, you will see a number of videos that are then recommended. Some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. There are other videos that skew in the opposite direction.

I assisted with data research on YouTube recommended videos (https://www.buzzfeednews.com/article/carolineodonovan/down-y...), and this claim is misleading at best. There may be some videos that aren't extreme, but that doesn't matter if they are a) in the extreme minority and/or b) appear very far down in the queue of 20+ recommended videos, when users typically either follow "Up Next" or only look at/act on the top few recommendations.

This isn't a "both sides" issue.




I have an unlisted video with about 15 views that when viewed in an incognito tab constantly recommends stuff like "Ben Shapiro owns liberal professor", alongside gun videos.

I just did an experiment where I clicked one of these and clicked "next" a bunch going through their recommended videos. It immediately dove into alt-right stuff, and eventually led to videos of antifa protestors getting beaten up.

This is normal on YouTube.


The same thing happens to me as well. I don't know who Ben Shapiro is beyond the fact that he's "not liberal" and have never watched his videos, but they are recommended in the sidebar of almost any video I watch.

I looked up the hip-hop artist Mac Miller when he died of a drug overdose, and all of the recommended videos were about his pop-star ex-girlfriend being the one responsible for his death. Clearly the engine has learned to go towards the extreme.

My opinion is to push for education about all aspects of these algorithms, and keep your kids off YouTube! It is so far from network or cable TV that it's a little scary. That said, it's an amazing resource, but it's geared toward growth and profitability at any cost (most likely lives).


Seriously, why the fuck is this? I get recommended Ben Shapiro, Jordan Peterson, MGTOW, etc. all the fucking time despite explicitly telling Youtube that I'm "not interested" in these videos.


This is normal, true, but be aware even your starting point gives clues. Your IP address, your browser choice, the fact it’s (probably) the first time YouTube sees you... it all feeds into the algorithm.

Might be the case that Chrome users in your area are going Incognito to watch Ben Shapiro videos. Given your reaction, that might be exactly the case.

Or it might be that Ben is a B-level celebrity who might attract a click that displays an expensive ad.


The normality of it is why it's so awful.

Maybe it's true that people matching some of my info are likely to watch Shapiro videos, but by optimizing for engagement they are basically trying to herd similar people towards whatever version of that person watches the most YouTube videos.

Indirectly YouTube wants everyone to be a right-wing fanatic, because the algorithm has determined that that sort of person watches a lot of videos.

YouTube shows videos that are likely to get clicks, but fanatics are more likely to click and watch videos. So things fanatics care about are promoted. YouTube seems to take this stance that it's just natural social dynamics at play, excusing themselves from addressing what they're enabling.


I wonder how much of this is due to sponsorship. Aren't Ben Shapiro, PragerU, etc. billionaire-funded outreach programs? Can you pay Youtube to give your videos higher positioning in recommendation lists without their higher positions being explicitly labeled as "sponsored"?


No, they are not billionaire-funded. It is more likely a result of PragerU's lawsuit against YouTube, and YouTube adjusting its algorithm to give them a bit of a boost.


https://en.wikipedia.org/wiki/PragerU#Funding

I'm not sure if Ben Shapiro is being directly funded, but it looks like PragerU is mostly funded either directly by billionaires or by think tanks (which are often also funded by billionaires, but I don't have time to dig into all of the different ones mentioned here).


[flagged]


[flagged]


I may be confused here, but the person you’re responding to is saying that Ben Shapiro does not advocate for a white ethno state. Are you saying that he does?


[flagged]


The more concerning thing is that you're an intelligent person: you're likely smart enough to know that tweet is a weird mix of general right wing feelings that would describe most Republicans and bizarre leaps of logic.

You also probably know (or have the knowledge to find out) that Ben Shapiro has publicly stood against the alt-right on numerous occasions.

Yet you're presenting this as a "verifiable fact" regardless.


> It's simply a verifiable description of Shapiro's views.

A description that's verifiable only insofar as the quality of the sources the author cited. And the author provided those sources as URLs on an image, which most people don't bother to look up. I'll link them up here and let readers decide whether this unverified Twitter account's description holds up:

1. https://www.dailywire.com/news/21144/complete-transcript-ben...

2. https://www.youtube.com/watch?v=aVQke0HkLiU

3. Author doesn't actually provide the source, we just have to take the author's word that the video is as described.

4. https://twitter.com/benshapiro/status/156246995978293248

5. https://www.nationalreview.com/2017/01/health-care-markets-g...

6. https://slate.com/news-and-politics/2018/01/is-ben-shapiro-a...

7. https://www.newsweek.com/ben-shapiro-democrats-are-love-taxi...

8. https://twitter.com/benshapiro/status/25712847277

9. http://www.freeman.org/m_online/sep03/transfer.html

10. https://www.dailywire.com/news/31980/media-are-lying-about-t...

Even just taking the time to compare the statements made in the image with the quotes used to justify them reveals the kind of artistic license that was taken in interpreting the sources at hand.


Shapiro has made many videos against the alt right. He does not advocate for a white ethno state.

As other posters have pointed out there's some laughably extreme logical twisting - "advocates for Muslim concentration camps" - going on in your tweet there.


>Antifa are pretty explicitly ('by any means necessary', carrying baseball bats, beating people up) a violent group

Provably untrue. Stop it.


I really don't know much about antifa, but most of what I have heard or seen supports this view (the Berkeley bike lock incident, the "punch a nazi" campaign, the video behind the "repent zoomer" meme, and the new antifa fighting club).


> Provably untrue.

So prove it. No, they look for and start the fights. This is just one case, true, but it makes your case hard to prove: https://www.foxnews.com/us/dc-antifa-leader-is-third-charged...


I don't think this is a fair summary of what the Product Chief is saying. When people talk about the YouTube rabbit hole, they're talking about watching video A, which links to B, to C, etc. until they reach content considered extreme.

This is simplistic, but say all videos sit on a 1-10 integer scale in terms of politics, and YouTube always recommends 80% videos at the same value as the currently viewed video, 10% one value higher, and 10% one value lower. It doesn't push anyone in either direction. Randomly clicking on recommended videos will randomly move you up or down (and usually not move you at all). But starting from the middle, you can reach an extreme in just 4 clicks - cue people talking about the YouTube rabbit hole.

But hey, maybe it's not enough to merely prefer content that isn't extreme - what if we wanted to actively push people to the center? We could recommend 80% videos that are one value closer to the center (or at the same value, if the viewed video is already at 5 or 6), 10% videos at the same value, and 10% videos one value further from the center. Randomly clicking on videos _will_ bring most viewers to the center in this system.

But we can still reach the extremes in just 4 clicks. This is why complaining that users can go down a rabbit hole feels disingenuous to me. They always will be _able_ to do so - unless YouTube never recommends videos that aren't more centrist than the video being viewed.

I suppose YouTube could just ban all videos that are at 1 and 10 in our spectrum, but that's basically just calling for censorship. And not to mention, 2 and 9 will just become the new definition of extreme.
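
To make the toy model concrete, here's a minimal simulation sketch of both policies (Python, assuming uniform random clicks and the 80/10/10 split described above; the numbers are obviously made up):

    import random
    from collections import Counter

    def step_neutral(v):
        # 80% same value, 10% one step up, 10% one step down, clamped to 1-10
        r = random.random()
        if r < 0.8:
            return v
        return min(10, v + 1) if r < 0.9 else max(1, v - 1)

    def step_centering(v):
        # 80% one step toward the center (or stay put at 5/6), 10% same, 10% away
        toward = v - 1 if v > 6 else (v + 1 if v < 5 else v)
        away = max(1, min(10, v + 1 if v >= 6 else v - 1))
        r = random.random()
        if r < 0.8:
            return toward
        return v if r < 0.9 else away

    def walk(start, clicks, step):
        v = start
        for _ in range(clicks):
            v = step(v)
        return v

    # Where do 10,000 random clickers end up after 20 clicks from the middle?
    for step in (step_neutral, step_centering):
        print(step.__name__, Counter(walk(5, 20, step) for _ in range(10_000)))

You'd expect the neutral policy to spread walkers out a bit (with a handful reaching 1 or 10), while the centering policy herds almost everyone back to 5 or 6 - yet the extremes are still reachable, which is the point being made here.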

Edit: many of the comments lead me to think that I haven't made clear what is and isn't in scope of my comment. Allow me to point out two things in particular.

1. YouTube already does prohibit hate speech. How well it enforces its rules is a valid point of discussion, but not what I'm talking about here. I'm pointing out that the ability to get to the extremes of what _is_ allowed on the platform through recommendations should not in and of itself be a point of concern.

2. I'm also aware that sometimes the recommendations are very messed up in the context of the current video - e.g. shock videos getting recommended to kids. But this is more about trolls deliberately gaming the recommendation algorithm, which I see as a distinct issue from the alleged rabbit hole pattern of recommendations.


The alternative is that YouTube could choose NOT to recommend videos for large categories of content, particularly content related to kids, or other categories (for example, Rohingya-related videos in Myanmar in the current time period). You need not censor the videos, but you can choose to make their discovery less easy.

There's no law that says they have to recommend videos. There is, however, sufficient evidence at this point that the rabbit-hole effect causes real-world harm. This isn't 2005. It's about time YouTube and other social media properties adopted the medical principle: "first, do no harm". Perhaps they will do this willingly, but somehow I doubt it.

To wit: https://www.bellingcat.com/news/americas/2018/10/11/memes-in...


You're right that YouTube is not obligated by law to recommend videos. They do, however, follow their internal guidelines, which try to keep the platform open to ideas. Some of those ideas are distasteful to the majority of the employees at YouTube, but they will still, in general, respect people's right to present those ideas.

Allowing monetization (where much of YouTube's controversies have been) is a completely different story since it also involves the demands of advertisers. The advertisers do have a right to avoid paying money to extremist content, just like you have a right not to donate to causes you don't believe in.

> first, do no harm

Easier said than done.

Challenge one is to define "harm" without classifying all sorts of alternative political ideas as "harmful".

Challenge two is once you've defined "harm", build a system that works at YouTube scale that can actually filter it. Also consider that it's a moving target - people are able to figure out how to get around YouTube's filtering algorithm, so YouTube needs to constantly update it to keep ahead of these people.


Recommendations, and in particular videos autoplaying based on them, clearly drive growth and profits. But they are also clearly the crux of the problem.

Sure, defining harm is not straightforward. But it's possible, and frankly as a private platform, it is their choice.

One could, for example, take a broader definition of harm, and use that to limit recommendations, but still host the video and allow it to return in search results. Is this the right answer? I don't know. But it's silly to just throw up your hands and somehow take recommendations as some sacrosanct part of the system, when it was clearly a choice to have them to begin with.


I'm not sure how kids factor into this; I've always heard of the YouTube rabbit hole in the context of pushing people to either end of the political spectrum.

As far as "do no harm" goes, implementing safeguards against the alleged rabbit hole has a very strong potential to do harm by making YouTube a partisan platform. Bear in mind that concern over the alleged rabbit hole is not equal between liberals, centrists, and conservatives. Many of the recommended videos I've seen people reference as evidence of the alleged rabbit hole are videos I'd consider mainstream. What it takes to placate the critics has very strong potential to violate the principle of "first, do no harm".


I would suggest looking through that previous link regarding examples of where one can quickly end up in white nationalist content through recommendations. Again, recommendations are for Youtube's own growth-related ends. However, it is emphatically a choice to have them.

Regarding kids, this has been very well documented: https://www.ted.com/talks/james_bridle_the_nightmare_videos_...

https://medium.com/@jamesbridle/something-is-wrong-on-the-in...


Your Medium link is not directly related to what I was referring to in my first comment. Yes, bad actors game the recommendation algorithms to troll people; that's what Elsagate was about: people deliberately making channels and videos that would appear as, and presumably get recommended as, kid-friendly content when it was actually shock content. I'm well aware that recommendation algorithms can be abused. Perhaps I should have more strongly emphasized that my examples describe a very simplified scenario that is limited to the alleged political rabbit hole and not things like Elsagate.

The point I am making is that the fact that it's possible to get to the extremes of YouTube by clicking through on recommended videos shouldn't be surprising. In fact, if you couldn't do that, it would be evidence of deliberately trying to keep certain content from becoming popular.

The question of how well YouTube polices its terms of service (which already prohibit hate speech) is related to the question of the alleged rabbit hole, because such content presumably exists at the edges of my hypothetical political spectrum. But the fact that clicking through on recommended links can bring people to 1 or 10 even if they start at 5 should not, by itself, be surprising.


What you're saying is true - people often seek extremes in all things without any extra nudging - but the problem is that YouTube deliberately creates several other positive feedback loops. It doesn't just show related videos, it shows popular related videos, and additionally selects them based on what you watched before.

It makes sense to show more Lego content on the sidebar of a Lego set review. It's less clear that I should be chased by Lego videos across the site after I clicked on a single one of them.

Also, considering that various extremes tend to attract attention, it would make sense to use an "anti-viral" algorithm that de-prioritizes videos that are getting too many views at the moment. This would have the additional effect of giving better exposure to less popular channels, promoting different content creators.
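
For illustration, such a de-prioritizing re-ranker could be as simple as this sketch (the field names and the log damping are purely my own assumptions, not anything YouTube actually does):

    import math

    def rerank(candidates):
        # Sort candidates by relevance damped by recent popularity, so videos
        # that are currently spiking lose some of their edge in the sidebar.
        def adjusted(video):
            return video["relevance"] / (1.0 + math.log1p(video["views_last_24h"]))
        return sorted(candidates, key=adjusted, reverse=True)

    print(rerank([
        {"id": "viral_outrage_clip", "relevance": 0.9, "views_last_24h": 2_000_000},
        {"id": "small_channel_video", "relevance": 0.7, "views_last_24h": 5_000},
    ]))  # the smaller channel now ranks first despite lower raw relevance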

Realistically, I don't think they will do either, even if they know for sure that it will help, because it will reduce addict... I mean, engagement of their user-base.

Jaron Lanier talked about these issues a lot recently. He makes some interesting points regarding incentives and feedback loops.


Wouldn't this all be easy to prove or disprove using random walks? Just run a thousand or so, from different starting points, and see where they all go, picking recommendations truly at random (without bias towards more extreme stuff).

If Youtube truly pulls you towards extreme videos, then you would always end up at those videos after a long run.
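
Roughly, the experiment would look like this (get_recommendations is a hypothetical placeholder for however you fetch the sidebar, e.g. scraping the watch page in a fresh session; classifying the endpoints is the genuinely hard part):

    import random
    from collections import Counter

    def get_recommendations(video_id):
        # Hypothetical placeholder: return the recommended video IDs shown
        # alongside `video_id` (scrape the watch page, drive a headless
        # browser, etc.).
        raise NotImplementedError

    def random_walk(start_id, steps=20):
        current, path = start_id, [start_id]
        for _ in range(steps):
            recs = get_recommendations(current)
            if not recs:
                break
            current = random.choice(recs)  # unbiased: no cherry-picking
            path.append(current)
        return path

    # Once the fetcher is implemented: tally endpoints of many walks per seed.
    # endpoints = Counter(random_walk(seed)[-1]
    #                     for seed in ["SEED_ID_1", "SEED_ID_2"]
    #                     for _ in range(500))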


I don't think you have to pick recommendations at random. Just turn on autoplay and see where it takes you.


Perhaps this positive feedback loop is true - but I haven't seen evidence of it. In fact, what quantitative evidence I have seen seems to indicate the opposite: https://www.thepostmillennial.com/does-youtube-facilitate-ri...

Key findings:

* Centrist content links to liberal content at over twice the rate it links to conservative content.

* Proportionally more conservative content links to centrist or liberal content than either liberal or centrist content links to conservative content.

* Liberal content links to itself at the highest proportional rate.

Granted, who knows what portion of liberal or conservative content counts as extreme. But concerns over YouTube extremism almost always have to do with conservative extremism, and the data does not suggest conservative content is being promoted by YouTube.


Not sure how this relates.

When I say "extreme" I mean "extreme" in all senses, not just politically extremist. Could be just an extremely one-sided view of something ("why X sucks") or even a "top 10" or "worst fail" video.


> When I say "extreme" I mean "extreme" in all senses, not just politically extremist.

I don't consider your opinion invalid, but I do get the sense that the majority of concern over the YouTube "rabbit hole" is about promoting politically extreme content. My comments, since the beginning, have been specific to the allegation of pushing users to the political extremes. I tried hard to be explicit about this, and even edited my comment (prior to your comment, mind you) to more explicitly spell out the fact that I'm not talking about extreme things that aren't political (e.g. the "Elsagate" issue of shock content getting recommended to kids).

If this is the issue you're raising, then bear in mind that it is not in scope of what I am pointing out, and I don't think it's in scope for the majority of people's discussions about the "YouTube rabbit hole".


This is a little off topic, but I watched a Jordan Peterson video once and now my recommendations are fairly extremely right-wing. I'm center-left, and the views Peterson espouses aren't politically extreme (although I've heard some are pretty bizarre); at 'worst' he is a moderate conservative (I say 'worst' only because at the current cultural moment "conservative = bad" is the prevailing meme). So I'm not sure exactly how YouTube works, but it feels like it's trying very hard to push me in a particular direction. To be clear, I don't think YouTube has a RW political agenda, but perhaps it has an economic interest in pushing people to the fringes (maybe people at the extremes click more ads or something). Or perhaps they have a left-wing agenda and want me to associate moderate RW views with extremist views (seems unlikely, but then again lots of prominent media companies are pushing a similar agenda).


> interest in pushing people to the fringes (maybe people at the extremes click more ads or something).

This theme has been explored in quite a few articles on news and advertising. It is not that extreme stories get people to click on ads more; rather, they keep people engaged for longer and looking at more content. More views, more engagement with the publishing site, and more ad impressions.

I recall reading that there exists an interesting link between moral outrage and dopamine highs. It basically becomes a drug, which publishers see as increased user engagement.


> To be clear, I don't think YouTube has a RW political agenda, but perhaps it has an economic interest in pushing people to the fringes (maybe people at the extremes click more ads or something).

I think it's simpler than what you're suggesting. They want to make money, and the indirect way they do that is to get people to watch more videos. If most people who watch Jordan Peterson tend to also watch extreme right wing content, and you watch his video, then you will get recommended right wing content, because they think that's what you will want to watch.
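
To make that concrete, here's a toy co-watch counter (entirely made-up data and video names; a real engine is far more sophisticated at scale, but the mechanism is the same kind of "people who watched X also watched Y"):

    from collections import Counter
    from itertools import combinations

    # Each set is one (fictional) user's watch history.
    watch_histories = [
        {"jordan_peterson_talk", "rightwing_rant", "gun_review"},
        {"jordan_peterson_talk", "rightwing_rant"},
        {"jordan_peterson_talk", "lecture_on_jung"},
    ]

    # Count how often each ordered pair of videos is watched by the same user.
    co_watched = Counter()
    for history in watch_histories:
        for a, b in combinations(sorted(history), 2):
            co_watched[(a, b)] += 1
            co_watched[(b, a)] += 1

    def recommend(video, k=1):
        scores = Counter({b: n for (a, b), n in co_watched.items() if a == video})
        return [v for v, _ in scores.most_common(k)]

    print(recommend("jordan_peterson_talk"))  # -> ['rightwing_rant']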

I work for Google but not on Youtube, opinions are my own.


I considered that, but I can't think of any plausible reason for prominent overlap in viewership of JP's moderate political content and extreme RW content. Maybe Peterson appeals to people who previously thought the only moderate option was quiet deference to progressives? Would be really interesting if this were the case--it might imply that Peterson has a strong moderating effect on RWers and/or that people are being taught a false dichotomy between extreme left and right.


> I considered that, but I can't think of any plausible reason for prominent overlap in viewership of JP's moderate political content and extreme RW content

From my understanding, JP is actually pretty connected with conservative/right wing politics. Not so much that he is necessarily right wing himself, I wouldn't know, but that he has a large right wing following.

I think they like him for his opposition to political correctness/gender pronouns. His philosophy on masculinity also really appeals to conservatives rebelling against feminism.

See: https://en.wikipedia.org/wiki/Jordan_Peterson#Academia_and_p... https://en.wikipedia.org/wiki/Jordan_Peterson#Gender_relatio...


It's a recommendation engine. People who like JP also like X.

And almost anyone who likes any alt-right video also likes JP, because he provides a veil of respectability for anti-feminist, anti-multicultural views.


That 1-10 value seems impossible to gauge correctly without just basing it on the profiles of a video's previous viewers.

And the crux seems to be that YouTube has a single primary interest in generating suggestions: to offer the viewer a video that they probably want to see next. Netflix recs are no different.

So, the impolite or gruesome content becomes clickbait for an ad revenue machine.


> But we can still reach the extremes in just 4 clicks. This is why complaining that users can go down a rabbit hole feels disingenuous to me. They always will be _able_ to do so - unless YouTube never recommends videos that aren't more centrist than the video being viewed.

I generally agree, but maybe an important detail hinges on the "can" (in "This is why complaining that users can go down a rabbit hole..."): are YouTube's recommendations a healthy mix that gives users a choice, or do the majority of recommendations push the user down the rabbit hole?


I do suspect that a lot of the alleged rabbit hole comes from people deliberately picking out recommended videos that are more extreme than the currently viewed one. But I'm interested in putting this theory to the test.

I'm actually thinking of making a script or an extension that automates, say 10, 25, or 50 random clicks in the top 10 recommendations and seeing where it ends up. Maybe this system exists already.
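
If it helps anyone get started, here's a rough Selenium sketch of that idea; the CSS selector is just my guess at YouTube's current sidebar markup and will almost certainly need adjusting, so treat it as a starting point rather than a working crawler:

    import random
    import time

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    # Assumed selector for sidebar recommendation links; verify against the
    # live page, since YouTube's markup changes frequently.
    REC_LINK_SELECTOR = "ytd-compact-video-renderer a#thumbnail"

    def random_click_walk(start_url, clicks=25, top_n=10):
        driver = webdriver.Chrome()
        visited = [start_url]
        try:
            driver.get(start_url)
            for _ in range(clicks):
                time.sleep(5)  # crude wait for the sidebar to load
                links = driver.find_elements(By.CSS_SELECTOR, REC_LINK_SELECTOR)[:top_n]
                hrefs = [l.get_attribute("href") for l in links if l.get_attribute("href")]
                if not hrefs:
                    break
                nxt = random.choice(hrefs)  # uniform pick from the top N
                visited.append(nxt)
                driver.get(nxt)
        finally:
            driver.quit()
        return visited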


I would love to see such an analysis, hehe :))

I have to admit that the exact problem is not easy to define:

e.g. if I have on my screen a video about "people-that-hate-bananas", would it be OK to get recommendations that show only, say, "why-bananas-are-bad", or should I (ethically) also get videos of the type "are-bananas-really-bad?" or "these-are-good-bananas!"?

Using the above example, I guess that if YouTube (and similar sites) only show recommendations of the first kind (which is easy to implement), then yes, I go down the rabbit hole (it reinforces/confirms the idea that "bananas are bad") even if the recommendations are no worse than the original video. Therefore even recommendations at an "equal" level might have to be evaluated at least partially negatively, since in this example they don't promote alternative points of view, alternative thinking, or a search for middle ground or compromise, if you understand what I mean.

In any case, evaluating the results you get might be very hard, if you want to do it automatically and understand the meaning (e.g. it's very hard to tell whether "good-bananas-are-bad" is negative or positive towards bananas)... :P (Maybe if you manage it you'll become a Nobel candidate, hehe, or maybe I'm underestimating the current power of AI, which is definitely possible.)


I think that the claim you quoted is actually the less interesting claim in the article; the more interesting one is this:

> ...It’s equally — depending on a user’s behavior — likely that you could have started on a more extreme video and actually moved in the other direction.

> That’s what our research showed when we were looking at this more closely..

Now this is also a bit of a weaselly claim. First because of "depending on a user's behavior", which isn't very clear (but for now I'll give the benefit of the doubt and assume they are trying to say "unless the user is actively seeking radical content"). Second because this claim could be true and YT would still be radicalizing (i.e. it could go 1->3, 3->2, 2->4, 4->3, 3->5, 5->4 in terms of extremism, giving a gradual slope towards the radical while still maintaining that you are equally likely to navigate to more extreme content as to less extreme content).

OTOH, if I understand the article you linked to correctly, all it says is that if you keep clicking "Up Next" you'll eventually get from mainstream content to extreme content, which doesn't invalidate the above YT claim if their research shows that users aren't actually doing that. Given that you assisted with that research, maybe you could clarify what you tested and how it shows the "rabbit hole" effect?


Honestly both of these are pretty easy claims to check. Not by manually looking and choosing recommended videos, but by following a random walk. If it is true that it all drags you into extreme videos, then if you take a random walk from any video, choosing recommendations at random, you would always end up down the hole in extreme videos.

In a lot of these experiments though, people explicitly pick the more extreme of the recommendations repeatedly, which of course will lead you down the rabbit hole.


The Buzzfeed article quotes the SPLC, a partisan organisation that has labelled anti-FGM campaigners and Muslim reform groups as 'hate' groups (and only admitted the mistake under threat of legal action).

https://www.washingtonpost.com/opinions/the-southern-poverty...


If "fucked up majorly once" means you should be disregarded, Marc Thiessen of "torture is legal and effective" fame should go on your disregard list too.


There are two examples in the comment you're replying to, and you can find more in less time than it took you to write that reply.



