Hacker News

>by Alec's excellent video which I recommend everyone watch.

I get what your advice is about, but to add some nuance it didn't cover... consider that I learned of Alec's Technology Connections channel 9 years ago because the YouTube algorithm suggested it to me.

Why did YouTube do that? It was because I had watched Ben's excellent Applied Science video showing vinyl grooves under an electron microscope: https://www.youtube.com/watch?v=GuCdsyCWmt8

So the first Alec video I got exposed to was his related topic on vinyl records (click "Oldest" to see them): https://www.youtube.com/@TechnologyConnections/videos

I'd argue that the Youtube algorithm is very good at finding adjacent videos of interest especially in educational topics and DIY repair tutorials.

You're suggesting people go to their YouTube subscription feeds, but people often have a list of favorites in their subscriptions because of the algorithm. There's a bit of a chicken-vs-egg situation going on there.

What a good algorithm does is help users with the Explore-vs-Exploit tradeoff: https://en.wikipedia.org/wiki/Exploration%E2%80%93exploitati...

- Explore --> Youtube algorithm sidebar recommendations of related videos.

- Exploit --> add a worthy creator to subscription feed and get alerted to new releases from that person

The "explore" part is helped by algorithms because they can suggest videos you would have never thought of because you don't know the keywords or jargon to type into a Youtube search box to get to it directly. "You don't know what you don't know."
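To make the explore/exploit framing concrete, here's a toy epsilon-greedy sketch in Python. This is not YouTube's actual system; the topic names and "enjoyment" scores are made up, and real recommenders are vastly more complex.

```python
import random

def epsilon_greedy(estimates, epsilon=0.1):
    """Pick a topic: usually the best-known one (exploit),
    occasionally a random one (explore)."""
    if random.random() < epsilon:
        return random.choice(list(estimates))   # explore: try anything
    return max(estimates, key=estimates.get)    # exploit: best so far

# Hypothetical per-topic scores a recommender might track for one user.
estimates = {"vinyl-records": 0.9, "electron-microscopy": 0.7, "diy-repair": 0.4}

print(epsilon_greedy(estimates, epsilon=0.0))  # pure exploit -> vinyl-records
```

With epsilon at 0 you only ever see more of what you already like; a small nonzero epsilon is what occasionally surfaces the adjacent video you'd never have searched for.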

But don't use the algorithm for politics or click on anything that has a thumbnail with the shocked Pikachu face. That just starts a feedback loop of crap.

Arguably, the algorithms could put one into a non-productive engagement loop never to escape. Personally, I don't think it's a big risk for educational/DIY topics because your brain gets saturated with "too much information" and hits a stopping point where you don't want to learn any more.

So... Algorithms can be bad ... but you can also make them work for you.




I agree with you almost completely. I never used YT as a content source until a few years ago - I’d never open the app and only watch videos linked or embedded / looking up a specific how to video. Now it’s different though.

I never go to my subscription feed - the front page algo keeps me up to date on any new content from people I want to see updates on. I’ve noticed too it almost has a “shadow subscription” where even though I am not subscribed to certain channels, it knows I watch every video by them so it gets on my front page too.

The front page really has a “vibe” that follows my interests around. Watch a few too many Minecraft videos or car repair and soon you start seeing more and more of the front page being those topics. Get a new interest in pyramids? Devlogs? Nature? The front page slowly decays old interests and promotes new ones.

Which is again why I don’t check my sub feed - it’s a graveyard of interests, many of which I don’t care about right now. The algo surfaces the ones I do.


> Watch a few too many Minecraft videos or car repair and soon you start seeing more and more

In my experience it's "watch one video outside of your recommendations and then half your next set of recommendations will be related to that". I'm scared to click on anything I'm not already subscribed to for fear of trashing the home page.


You can (almost) always tell it you're not interested in those videos, and it slowly stops suggesting them. You can also ask it to never recommend a certain channel.


You can’t, however, stop it from suggesting huge blocks of Shorts or crappy free-to-play games.


You somehow must be able to, as everyone I know has feeds full of that sort of thing, and I've never seen even one short or game on my youtube home page.


Nope. You specifically can't disable Shorts, no matter how much you want to - so much so that YouTube ReVanced has a feature for it.


> You specifically can't disable shorts

You can use the three dots to say "Not Interested" on the Shorts shelf but it only hides it for 30 days and then the insidious little worm comes right back.


Yes you can, with something like uBlock Origin, or even a simple stylesheet override.


If you have to go outside of YouTube to modify its behavior, that doesn't really qualify as "you can make it stop". That's like saying you can make someone stop shooting you by wearing a Kevlar vest.


What a silly comparison.


Seems perfectly apt to me. Javascript/CSS are also no solution at all on mobile devices.


One is pixels on a screen and the other is bullets in your torso. Can you explain how that's apt?

But yeah, I do apologize for trying to offer solutions, as they are not perfect.

(Pretty sure uBlock works mobile too, but that's irrelevant)


Sadly not available when using the Apple TV to watch YouTube. If there's a NextDNS / PiHole method that stops YouTube ads, I could probably implement that...


I tried this for games and Shorts, but something about their HTML changes enough that they keep reappearing. I absolutely despise Shorts.


Yeah they tweak the HTML enough that static stylesheet overrides can become outdated real quick. Very annoying. It's been a while since I looked, but there probably is some uBlock Origin filter list that'll handle this and will stay fairly up-to-date.
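For anyone wanting to try this, the stylesheet-override approach usually means uBlock Origin cosmetic filters along these lines. The element names below are illustrative only and, as noted above, YouTube changes its markup often enough that they may already be stale; a maintained community filter list is more durable than hand-written rules.

```
! Illustrative uBlock Origin cosmetic filters to hide Shorts shelves.
! Treat the element names as a sketch, not guaranteed-current selectors.
www.youtube.com##ytd-reel-shelf-renderer
www.youtube.com##ytd-rich-shelf-renderer[is-shorts]
```

Rules like these only hide the elements client-side; they don't change what the algorithm recommends.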


You can delete things from your watch history to clean up the algo. If I ever let a TV auto-play for too long, or have kids over watching things, I go through and delete anything I think will negatively affect the algo.

I feel like clicking a video and immediately clicking off is also a negative signal they use but YMMV.


My interests vary so widely that my home feed is awful. I like watching videos on power tools, software development, video games, sports highlights, math, and other, less focused things like the horde of cat video channels I sub to to keep existential dread at bay.

I get recommended right-leaning videos and videos with ads for Manscaped, and I'm neither a conservative nor a man. It's super weird, so I tend to separate my interests into two apps: the YouTube web app for "junk food content" and FreeTube when I want to learn and focus. It's the only way I've found to not be fed the random content carrots while falling down the rabbit hole.


I manage to keep a pretty diverse set of topics.

Right now my homepage seems to be

- construction/DIY videos (Perkins, B1M, Megaprojects, Matt Risinger, NS Builders)

- video game dev (blackthornprod)

- "indie game of the day" channels (Aliensrock, Nialus)

- military videos (Battleship New Jersey, Ryan McBeth)

- freerunning / urban exploration (STORRER)

- movie & tv analysis / commentary (Frame Voyager, Corridor Crew, New Rockstars)

- chess (agadmator, Magnus)

- Minecraft (Mumbo Jumbo)

- random documentaries (fern, Stewart Hicks, Half as Interesting)

- egypt / pyramids (History for GRANITE)

- science / engineering (Adam Savage, Colin Furze, Applied Engineering)

- coding (Tsoding)

From just a quick scan of the topics / channels.


YouTube disrespects anyone who is multilingual. They shove AI translations in our faces, because the content creator is responsible for disabling it, not the user. Instead of German videos I get German videos with English AI audio. I also keep getting videos on my front page from the same game I always ignore or mark "not interested".

YouTube wants my money. They will never get it when they come up with things like that. I will give them my money once they start cracking down on ads - and by that I mean actually moderating ads, not serving random ads with porn. As long as they serve scam ads I will never give them money, and it does not look like that will change in my lifetime.


Paying them money removes all ads though, scammy or not. And they just announced their lower priced tier, although I think not all genres are ad-free.


Ad blockers do the same and will be my choice. It is their choice not to moderate their main business. It is their choice to disable monetization on history, ..., people who say fuck in the first minute.

Is it that hard to look at all the BS and say - no not my money?


But most of the channels I'm interested in have "in video" ads (ads put there by the creator rather than Youtube, I don't know the industry terminology) that remain even if you pay for Youtube Premium.

I wish Premium provided some way for me to predict how many of these kinds of ads are in a video I am considering watching (e.g., by requiring the creator to tell YT how many there are and imposing consequences on creators that lie) but YT does not.


Those are sponsor segments, and I have no problem with them. These videos can be expensive to produce, and these people still need to support themselves. If you take away or reduce their ad revenue, how do we expect them to survive?

There is an extension called SponsorBlock that will automatically skip them and other marketing / time wastes. Otherwise I just skip ahead (the highest peak in the watch graph on the playbar will usually be at the end of the segment).


>If you take away or reduce their ad revenue, how do we expect them to survive?

I want YT to pay them out of what I pay for YT Premium.


They do, and it's a pittance.


Where are you getting this information? Are you a YT creator?


No, just public sources and seeing others review their income. Which part? YouTube shares details in [0] on how Premium earnings work; I believe the current split is 55% to creators. This is like Spotify, where your payment is proportional to your eligible watch time over all watch time from all premium-eligible videos across all premium accounts. YT Premium Lite was announced here [1].

[0] https://support.google.com/youtube/answer/6306276

[1] https://blog.youtube/news-and-events/introducing-premium-lit...
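The proportional split described above could be sketched like this. The 55% creator share and the per-minute pooling are assumptions drawn from the public sources mentioned, not YouTube's documented formula, and the dollar figures are made up.

```python
def premium_payout(subscriber_revenue, creator_watch_minutes,
                   total_watch_minutes, creator_share=0.55):
    """Watch-time-proportional revenue split (assumed model).

    A fixed share of a subscriber's payment goes into a creator pool,
    divided by each creator's fraction of that subscriber's watch time.
    """
    creator_pool = subscriber_revenue * creator_share
    return creator_pool * creator_watch_minutes / total_watch_minutes

# e.g. a $13.99 subscription where one channel got 30 of your 300 minutes
print(round(premium_payout(13.99, 30, 300), 2))  # -> 0.77
```

Under these assumptions, a channel that holds 10% of a subscriber's monthly watch time earns well under a dollar from that subscriber, which is roughly why per-creator Premium payouts look like a pittance.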


In all honesty I had no idea YouTube had a sub feed until you mentioned it. I still don't know how to access it, but the home page more than suffices for me.


Yeah, pro/anti "algorithms" is too reductive, especially since the old status-quo was also an algorithm of people and processes.

I'd rather use a lens more like all the open-source/free-software concerns about controlling your own computer:

1. Can I see how the recommendation algorithm is intended to work? The site-owner says it works for my benefit, but what if they're mistaken, or lying?

2. What has it recorded about my interests, and how can I fix bad records that don't represent them?

3. When it's not working well--or harmfully exploiting my baser weaknesses--how can I change to a different one?


Kind of comes down to one of Neil Postman's questions

"Whose problem is it that it solves?"

It's possible to get some benefit from an algorithm/process, just as a side effect, that was never designed to work in your interest and is an opaque cloud service. Maybe the service is solving the network owner's problem of selling you to advertisers. If you want to maximise for "interest and relevance to my life goals" there's nothing to stop you running your own "algorithm" of course, except any obstacles put in your way by the data network owner. For that reason it's more important to pay attention to the freedom of the network (open API, federated, maximally distributed etc) than the algorithms that run on it. If you control the former you control the latter. HN (the network) seems to allow a lot from the plethora of viewers I've seen.


This seems like the perfect place to once again recommend "Amusing Ourselves To Death" :-)

I also read "Technopoly" recently, and while it didn't have quite the same impact on me, I can't deny that it accurately describes the techno-political moment we're currently living in. Well worth the time.


I started using YouTube as a frequent content source in the past year. I've been aggressively curating my recommendations by clicking "Not interested" on anything I don't want to see a lot of. If I'm curious I'll watch the one video, but in Incognito. If a channel gets repeatedly re-recommended, I don't hesitate to use "Don't recommend channel". I also +like everything I've watched and subscribe to channels whose new content I'm more likely than not to watch. Getting recommendations pruned feels like getting to a zero-email inbox.


Unfortunately, it doesn't really work like that. If you say "Don't recommend channel", the channel may in fact be recommended and similar channels will definitely still be recommended. "Not interested" works even less accurately.


IME, "Not interested" at least prevents that single video from showing in my feed. It may reduce similar content from the same or other channels; I see some reductions. "Don't recommend channel" has always worked for me as far as I've noticed, except in search results, which seems correct. I never expected it to not recommend similar channels, e.g. I filtered out Linus Tech Tips but watch similar content.

It probably helps that I only permit a handful of specific topics: physics, fun math, synthesizers (but not modular), tiny bit of music theory/training, StarCraft 2 (not SC1/BW), and recently the Nvidia/AMD GPU release saga.


> Personally, I don't think it's a big risk for educational/DIY topics because your brain gets saturated with "too much information" and hits a stopping point where you don't want to learn any more.

This may also be an artifact of the fact that you are the sort of person who seeks out educational content. I.e. you have a high need for intellectual stimulation. That makes you an outlier among all people who use social media.

Personally I think technical people underestimate the negative impacts of the models that drive the algorithms. We are basically training humans via a reward function that maximizes watch time. We are also heavily correlating errors in knowledge because popular stuff gets boosted so much. Correlated errors are bad for robustness.


YouTube seems less of a social media site than Facebook and Twitter to me. But maybe that's because I mostly use it for educational content also. I want a good recommendation engine, but I don't care what videos my friends and neighbors are watching.


> Arguably, the algorithms could put one into a non-productive engagement loop never to escape. Personally, I don't think it's a big risk for educational/DIY topics because your brain gets saturated with "too much information" and hits a stopping point where you don't want to learn any more.

In my personal experience, "edutainment" can certainly be addictive, and more often than not, consuming it is "unproductive" because (1) consuming content aimlessly is intrinsically mostly passive, (2) passive consumption is ineffective for retaining knowledge or building understanding, (3) content is often superficially interesting because of a spectacular and/or highly simplified presentation.

This is only a counterpoint to the idea that educational content is limited in its potential to be addictive/unproductive; there is still, obviously, a great positive potential to high-quality educational content.


You make a valid point about the benefits of algorithms in content discovery. While they can enhance our experience by introducing relevant material, it's crucial to maintain control over our consumption to avoid potential echo chambers.

On a related note, for those looking to access diverse content without regional restrictions, reliable proxy services can be invaluable. NodeMaven offers high-quality proxies that ensure secure and unrestricted browsing. I can drop a link for everyone, as it really helped me during my thesis: https://nodemaven.com/proxies/residential-proxies/.


One point to observe here is that there’s a difference between a “related content” section when viewing specific content and the more general algorithmic feed that is designed to be the primary mode of discovery and interaction.


Thanks, this is an important nuance. Recommendation algorithms are absolutely useful, and if you're so inclined you can absolutely make them work for you, but this is about making educated, conscious decisions about what you click next in your 'Related videos' section.

Algorithmic feeds don't give us that opportunity - they're designed to require minimal effort and to keep the dopamine coming without any conscious decisions.


RIYL: Tom Scott's There Is No Algorithm For Truth [1] delivered to the Royal Society in October 2019.

1. https://www.youtube.com/watch?v=leX541Dr2rU


I dislike any politics or clickbait too and it doesn’t come back.

I have no complaints about my Instagram and YouTube feeds. They give good recommendations.


My only complaint is that it doesn't encourage enough innate curiosity. I am disincentivized from clicking on some things precisely because their recommendation engine is so tightly wound that I'll get bombarded with really tangential content for days or weeks after. Let me click a few videos about a topic before assuming I'd been hiding a lifetime passion for '80's John Deere combine trans-axle repairs that I need itched almost exclusively to within an inch of its life, all because I thought that guy in the thumbnail might get squished, and didn't adhere to my better angels. Besides that, it's pretty, pretty neat. Not great not bad. I can recommend farm machinery repair YouTube--it is a vibrant community of delightful weirdos and incredibly smart folk who talk about software more than you'd think.


You're basically training their advertising model for them, so yay for the Algorithm, I guess.


> I dislike any politics

TikTok in particular sneaks politics into everything. Even if it's not explicitly political.

I asked Deepseek once to walk me through what it knows about TikTok and it claimed the Chinese version uses an RL approach to sprinkle socialist core values into your feed even if you explicitly don't want politics. It also claimed TikTok absolutely promises it doesn't do this in the US. I'm not really convinced Deepseek knows what it's talking about but it was pretty plausible technically.

But in practice it's easy to tell if someone, even in the US, spends a lot of time on TikTok, based on their strongly held opinions even when they explicitly say they never watch political content.

I doubt other social media companies do this because they aren't created specifically for political propaganda like TikTok is, but it's possible they do.


Good, they should sprinkle socialist values in. What's the other option, passive acceptance of capitalist exploitation? People need to be educated.


You may want to do some basic background reading on what socialism actually is.

People in the US tend to think it refers to things that are actually mixed economies or primarily market economies with strong social guarantees. Think things like the Nordic model or the European model.

Mixed economies with social welfare guarantees are mainstream economics. Actual socialism, as practiced in reality by the countries that have implemented it, is mainly characterized by the absence of human rights (including zero worker rights), mass murder, poverty, and a ruling elite who are functionally oligarchs and have enslaved the rest of the population. And on top of this, all socialist states that I'm aware of have re-introduced markets in some form while retaining the dictatorship structure.

Socialism in the past (e.g. in the 18th century) has referred to other ideas, but it doesn't really anymore.

But even if you were to believe somehow that there's some morally redeemable version of socialism that has somehow just managed to hide all this time, the actual version of socialism embraced by China and promoted on TikTok is fully authoritarian, anti-democratic, and does nothing to improve economic equality in the US.


>People in the US tend to think it refers to things that are actually mixed economies or primarily market economies with strong social guarantees. Think things like the Nordic model or the European model...

Yes. This is what people in the US mean when they say socialism in general conversation. They don't mean pure Socialism as Marx talked about it. Similarly when people say the US is a democracy, we know they don't mean it's an actual pure Democracy where everyone votes on every issue.


Yes, with the pedantic caveat that those aren't similar, in the sense that one is a common confusion and the other is correct.

The Nordic model is the sort of economy and political structure that the 19th century socialists explicitly reacted against and rejected. It's a representation of mainstream liberal democratic theory not of socialism.

On the other hand democracy was always understood to be representative because you can't make every decision by plebiscite.

It has to do with being accurate rather than being "pure".


Words evolve with time and usage. If someone said they were gonna send you a mail, would you expect an actual paper letter or an email?


These particular words haven't evolved in any meaningful way in the last century or so. If anything socialism now refers exclusively to revolutionary/militant utopian socialism whereas it had more varied meanings long ago. That's probably largely an artifact of the fact that socialism since the early 1900s has mainly been promoted by socialist countries after a revolution.

Go on a socialist forum and ask them what they think about markets or mixed economies. E.g.

- https://www.reddit.com/r/Socialism_101/comments/w03n2p/what_...

- https://www.reddit.com/r/CapitalismVSocialism/comments/19558...

- https://www.reddit.com/r/Socialism_101/comments/19ewm26/can_...

- https://www.google.com/search?q=reddit+socialism+mixed+econo...

It's harder to search other socialist forums like the darknet ones, but you see the same patterns there


You're talking to a member of the DSA, buddy. I read about this shit all day. Also, whenever an American uses the word "authoritarian" it is extremely unclear what they mean; usually it means "bad", because they have no principled way of identifying democratic or undemocratic procedures. Most Americans have barely experienced any democracy in their lives at all, just voting every once in a while for canned choices.


> version of socialism embraced by China and promoted on TikTok is fully authoritarian, anti-democratic, and does nothing to improve economic equality in the US

This oversells what China is. China is where government and oligarchy are in a strange symbiosis. Capitalism is worse in China in many ways. You are more free in China to exploit others on a large scale. H1B visas seem to be an authoritarian idea to me, and something that is heavily exploited in China.

What the US and China have in common is a strange kind of nationalism that I can not define, might be because I come from a smaller country.


You are maybe thinking of something called "Authoritarian Capitalism" https://en.wikipedia.org/wiki/Authoritarian_capitalism

All socialist states have been functionally authoritarian oligarchies in practice. I'm unaware of an exception. Someone once claimed Yugoslavia was, but I'm not familiar enough with it to have an opinion.

It's also confusing because there's the notion of State Capitalism (https://en.wikipedia.org/wiki/State_capitalism#Maoists_and_a...) that I believe Marx also wrote about but I don't remember the details off hand. I think this may more or less be the same idea as authoritarian capitalism, but it's also what socialist regimes eventually look like when they realize they need to re-introduce markets to prevent economic collapse.

My take is that the idea of socialism, communism, and capitalism are deeply muddled in theory and even more muddled in practice. Muddled or not, we have people who consider themselves socialists and followers of Marx, their theory is incoherent, and in practice they try to spread violence and destroy any country they take over. We don't really have people clinging to capitalism in the same way, although some right wing factions in the US tried in the 70s during the cold war to make a capitalistic sort of political religion modeled after the success of socialism.

We'd be better off not having to deal with any of these concepts except for the fact that some people have a strong allegiance to these concepts and there's no changing that in the short term. To me it's a bit like when people killed each other over ideas like how to interpret the tripartite Christian God. Is the son a separate person or the same person, and how does the holy spirit fit in? You get an entire tree of ideas, most of which eventually became heresies punishable by death. None of them made any sense but that didn't stop it from being a motive historical force.


The issue is that algorithms are like a casino for your time.

We all know that gambling addicts exist and how destructive it is to their lives, the casino exploits behaviors and gets all their money. As a result people know casinos are dangerous, reasonable people avoid them, are warned about them, and the government forces regulation to reduce their ability to exploit vulnerable people.

Imagine if none of these controls existed and nobody talked about or generally knew that casinos were dangerous. Imagine if the casinos were 100x better at exploiting you and you were forced to walk through a casino every time you leave your house. You’d get a lot more people having their lives destroyed.

So what this video tries to do is important, naming the term, “algorithmic complacency”, allows it to be recognized, discussed, and actively kept in check by users. Ideally regulated by the government as well, just like casinos.

The casino also provides a service, entertainment, there’s nothing wrong with a reasonable person attending, spending some money and being entertained. But we as a society recognize that a company exploiting behaviors to get all of a person’s money, is bad, and try to limit that negative outcome even though we still allow casinos to exist.

Time, attention, and focus are so abstract that people don't even realize they're spending them, or how modified their behavior has become because of the algorithm's exploitation. As a result, we let companies that are 100x better at manipulation than casinos operate without so much as mentioning they're doing it, stealing increasing amounts of a user's time.


I'd be very skeptical of applying anything we knew about the YouTube algorithm 9 years ago to it, or any algorithm, today.

Google, Facebook, and the other algorithm-driven tech companies have been aggressively enshittifying their products at least since 2020. "I got fun/useful videos out of the YouTube algorithm in 2016" says nothing about what that algorithm is like in 2025, given that they can change it silently on a whim.


In today's ChatGPT age, I'm sure people are attempting to build contextually aware alternatives to feeds?

Something that will filter out the anger, but keep the insight. I vaguely remember someone posting about a tldr for twitter. Anyone know of tools like that?



