
I was raised on tech. A third generation computer user, started writing software at seven under my father’s guidance. A Luddite I am not, but this doesn’t bode well for our future. YouTube is basically an indoctrination engine for white nationalism. It’s more or less what the right claims the American higher education system is for the left, only there’s no conspiracy fantasy to it.

https://www.theregister.com/2022/10/18/youtube_algorithm_con...



The actual findings, as reported in the very link you post:

> "We found that YouTube's recommendation algorithm does not lead the vast majority of users down extremist rabbit holes, although it does push users into increasingly narrow ideological ranges of content in what we might call evidence of a (very) mild ideological echo chamber," the academics disclosed in a report for the Brookings Institution.

> "We also find that, on average, the YouTube recommendation algorithm pulls users slightly to the right of the political spectrum, which we believe is a novel finding."

So, about as close to being an "indoctrination engine for white nationalism" as a librarian that recommends books you like. And I am saying this as someone who reads Jacobin and watches any interview with Chomsky I can find.


“Pulls users slightly to the right”

So the indoctrination isn’t obvious? So it’s subtle? That makes it more pernicious, in my eyes.

I never said anything about a vast majority. To indoctrinate doesn’t mean to convert an entire population, or even a percentage thereof. On the contrary, it refers to a process of teaching a person or group (of any size) to accept a set of beliefs uncritically. It doesn’t specify to what degree beliefs have to change, nor how rapidly or severely.

YT recommends Fox, Shapiro, et al to kids watching anime, to adults whose sole interests are cat videos and programming tutorials. A bit different than a librarian suggesting books one might like.

And what happens if the librarian sees I’ve been checking out the likes of Mein Kampf, and makes recommendations based on that? Does indoctrination through multiple channels cancel itself out, or something? I’m not sure what point you’re trying to make there, but it sounds a lot like “bad things can happen in other places so it’s acceptable if YouTube does bad things too.”

One should consider the effect on those already radicalized in addition to the indoctrination of the non-radicalized when seeking to understand the political ramifications of such bias in algorithms. It’s not like they exist in a vacuum, after all.

edit: Add to that, on the topic of librarians, the decentralized nature of libraries and librarians ensures any effect of a single librarian will be limited to a local area. I don’t think we can say the same for YouTube’s algorithms.


My point is this: recommending related videos is not indoctrination, even if the content is political. If I'm watching Shapiro and YT recommends Fox, this is not indoctrination (same as, if I'm watching Young Turks and YT recommends Majority Report, it's not indoctrinating me).

Now, if I'm watching Anime and YT recommends Shapiro, I can agree that's closer to indoctrination. However, if it only happens like 2 times for every 10M watches of anime, and then 1 time for every 10M it's recommending Young Turks, then it's not really a significant force in this area; and it is only pushing slightly to the right - and I believe this is the sort of thing that the study found. So coming back to your first quote:

> So the indoctrination isn’t obvious? So it’s subtle? That makes it more pernicious, in my eyes.

No, that is not what the study found. It found that political recommendations for right-leaning content are slightly more common than those for left-leaning content.


I don’t need to reword the findings to make them support my assertion.

Again those findings: “We also find that, *on average*, the YouTube recommendation algorithm pulls users slightly to the right of the political spectrum”.

The whole “on average” nullifies the notion that occasionally recommending Young Turks to kids watching Anime once in a while somehow makes up for the fact that they push OANN or Newsmax even harder. That’s like saying I took one step forward so you should ignore the two steps I took backward.

Also you are ignoring the implications further down the line. If YouTube pulls neutral to the right, then it likely pushes those already right even further in that direction.

Are you familiar with the concept of network effect?

> So the indoctrination isn’t obvious? So it’s subtle? That makes it more pernicious, in my eyes.

>> No, that is not what the study found

“In my eyes” isn’t analogous to “that’s what the study found”, FYI.


> The whole “on average” nullifies your assertion that they recommend Young Turks to kids watching Anime as much as they do OANN or Newsmax.

I didn't say that they do it "as much", I specifically suggested they may do it half as often. But, per the study, they DO do it - otherwise, this would not have been a "slight" bias, it would have been a whopping huge bias.


That was a poor edit on my part. My apologies.

What I meant to say was that the assertion that occasionally recommending Young Turks somehow mitigates the right-leaning bias of the site, as suggested by the statement “then it's not really a significant force in this area”, is false. The site has a demonstrable right-wing bias.

Elections can be and are decided by a few thousand, or even a few hundred, votes in battleground states. As such, the argument that it’s of negligible effect rings false to me.


Even on a much smaller scale, the algorithm is incentivised to radicalise you. A few years ago I would watch videos of helicopters with my 4-5 year old son because he loved helicopters and enjoyed watching them lift things, cut trees, put out fires, etc.

Then the suggested videos started including helicopter crash compilations and he was super keen to see those and lost interest in the more "vanilla" helicopter videos. That was the end of that avenue of entertainment and it's only now he's 11-12 that he's getting some limited access to youtube again.


Wikipedia also documents it:

https://en.wikipedia.org/wiki/Alt-right_pipeline

Or if someone rather likes the article format:

https://bpr.berkeley.edu/2022/10/31/addressing-the-alt-right...



