In this thread, on social media, and in recent articles critiquing YouTube's recommendations, the anecdotal examples all seem chosen to support a claim that YouTube's recommendation algorithm radicalizes viewers toward the far right of the US political spectrum. To me this looks like a naive, self-serving perspective advanced by those on the US political left. The same algorithm presumably also channels viewers to content from, say, Alexandria Ocasio-Cortez, but no complaints are made in that instance, since that content aligns with the views of those doing the complaining.

These algorithms recommend what people watch, based on prior history. As outsiders we can't know exactly how the system works, but it is presumably some mix of the video you are currently watching and other videos in your watch/search history, fed into a model that considers all users' behavior and produces probabilistic relationships between videos. This is a fairly standard way for recommendations to work, and it is reasonable. As others have pointed out, there are also many anecdotes of YouTube recommendations working great - for example, music discovery. We should keep in mind that the engine being criticized is the same one behind those positive anecdotes.
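
To make that concrete, here is a minimal sketch of item-to-item co-occurrence scoring, one standard way to derive such probabilistic relationships between videos. Everything here - the data, the names, the scoring rule - is hypothetical; YouTube's actual model is not public.

    # Hypothetical sketch: score video-to-video relationships from co-watch counts.
    from collections import Counter, defaultdict
    from itertools import combinations

    # Made-up watch histories: one set of video IDs per user.
    histories = [
        {"vidA", "vidB", "vidC"},
        {"vidA", "vidB"},
        {"vidB", "vidD"},
    ]

    watch_counts = Counter()          # users who watched each video
    co_counts = defaultdict(Counter)  # users who watched both videos of a pair

    for history in histories:
        watch_counts.update(history)
        for a, b in combinations(sorted(history), 2):
            co_counts[a][b] += 1
            co_counts[b][a] += 1

    def related(video_id, top_n=5):
        """Rank other videos by P(watched other | watched video_id)."""
        scores = {
            other: count / watch_counts[video_id]
            for other, count in co_counts[video_id].items()
        }
        return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

    print(related("vidA"))  # [('vidB', 1.0), ('vidC', 0.5)]

On this view, a Joe Rogan video pointing to a Jordan Peterson video is nothing more than other users' co-watch statistics surfacing in the scores.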

For a lot of users, the recommendations some are labeling as incorrect, radicalizing, etcetera are actually the recommendations they want and enjoy. Why should other users or YouTube intervene if that's what user behavior supports? Clearly, the reason is that these other users want to constrain access to views they don't share - which is just another way of saying they support censorship when it applies to others. There is a dangerous concept creep in Google's blog post:

> We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines.

This is really the first step toward expanding their guidelines, and if they cave either to internal political activism or to the complaints coming from outside (a group with very uniform views aligned with the political left), it poses a risk to public discourse. That is in part because there are very few alternatives to Google in general, and to YouTube in particular. As others have pointed out, BitChute was blackballed by payment processors. Any constraint on the free exchange of information on YouTube is dangerous because so much of online discourse happens on Google's platforms, under Google's control.

I can't help but think that the BuzzFeed article and others are basically cherry-picked hit jobs meant to create a false narrative that something more nefarious is going on than YouTube simply respecting user behavior. The impetus to say something is wrong here seems to be based entirely on personal political biases. For example, some people here are bemoaning the fact that watching Joe Rogan's podcasts (which are less political) leads to recommendations for Jordan Peterson or Ben Shapiro. If that's the behavior of other users who watch a Joe Rogan video, I don't see anything wrong with that recommendation. And if you dislike it, consider that this might be the exposure to different views that people so often pay lip service to.

Some of the other annoyances outlined by commenters are less about specific content and more about general usability. A summary:

- It is no longer possible to see recommendations based on just the current video. I agree this is poor design. At some point YouTube switched the recommendations feed from being centered on whatever content you are currently seeking (with the currently playing video as an input) to showing recommendations based on everything you've ever done. There is a place for the current set of recommendations, but it should not be the sole set. Having a "Based on this video" section of recommendations and a "More for you" section would solve this problem (see the first sketch after this list).

- User feedback saying you don't want to watch something is not immediately respected. Other users say that it is respected if you provide the feedback multiple times. The latter may be appropriate, since a single dislike is likely best interpreted not as an absolute rule but as a signal that gets ingested into the algorithm and weighted appropriately (see the second sketch after this list).

- People want to be able to watch a video without having it feed into their recommendations. This is currently possible by pausing watch history or search history, but those controls are buried deep in the settings. I agree; YouTube should make a privacy-mode toggle available right next to the video.

- Sometimes videos that were already watched are recommended. I'm split on this one - I've certainly seen them recommend a previously watched video when I don't think rewatching would make sense. But that is likely based on user behavior as well: if enough users rewatch some video, it might be a fair and correct prediction to suggest it to me again. When you click the 'Not Interested' link beneath a video, you can also click 'Tell Us Why' and select 'I already watched it' to give them that hint. But the interface is certainly a bit cumbersome.
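
On the first item: a hypothetical sketch of the two-section layout, keeping video-anchored and history-anchored recommendations separate instead of merged. The helper functions are stand-ins of my own invention, not real YouTube APIs.

    # Hypothetical two-shelf feed: one shelf keyed to the current video,
    # one keyed to overall history, never merged. Helpers are stand-ins.

    def related_to(video_id):
        # Stand-in for an item-to-item lookup like the co-watch sketch above.
        return {"vidA": ["vidB", "vidC"], "vidB": ["vidA", "vidD"]}.get(video_id, [])

    def for_user(history):
        # Stand-in for a ranking over the user's whole watch history.
        return ["vidD", "vidE"]

    def build_feed(current_video, history, per_shelf=10):
        return {
            "Based on this video": related_to(current_video)[:per_shelf],
            "More for you": [v for v in for_user(history) if v != current_video][:per_shelf],
        }

    print(build_feed("vidA", ["vidA", "vidB"]))
    # {'Based on this video': ['vidB', 'vidC'], 'More for you': ['vidD', 'vidE']}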
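
On the second item: one way to treat a dislike as a weighted signal rather than an absolute rule is to decay a video's score with each piece of negative feedback, so repeated feedback eventually suppresses it. The decay factor below is an assumption for illustration, not a known YouTube parameter.

    # Hypothetical: each 'Not interested' click halves the video's score
    # instead of removing it outright, so repeated feedback suppresses it.
    NOT_INTERESTED_DECAY = 0.5  # assumed penalty; YouTube's real weighting is unknown

    def adjusted_score(base_score, not_interested_clicks):
        return base_score * (NOT_INTERESTED_DECAY ** not_interested_clicks)

    for clicks in range(4):
        print(clicks, adjusted_score(1.0, clicks))
    # 0 1.0 / 1 0.5 / 2 0.25 / 3 0.125 - the video fades after repeated clicks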


