Subscriptions Drive Views of Extremist Videos on YouTube (dartmouth.edu)
16 points by alephnerd on Nov 3, 2024 | 13 comments


Effectively disabling dislikes is what has led to this mess in the first place.

If your algorithm takes the like and dislike ratio into account for which videos are shown, then people are encouraged to create content that is not offensive to the majority of people. There is a natural self-policing and extreme takes get less visibility.

If you optimize for engagement, well we all know that spreading hate and general negativity generates the most engagement. So content creators will focus on that kind of content. There is no way for the moderate majority of the user base to keep them in check.
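The contrast between the two ranking philosophies can be sketched roughly as follows. This is an illustrative toy model only; the function names, fields, and formulas are assumptions for the sake of the example, not YouTube's actual algorithm:

```python
# Toy comparison of ratio-based ranking vs engagement-based ranking.
# All names and formulas here are illustrative assumptions, not YouTube's real system.

def ratio_score(likes: int, dislikes: int) -> float:
    """Rank by like share of total votes -- divisive content is penalized."""
    total = likes + dislikes
    if total == 0:
        return 0.0
    return likes / total

def engagement_score(avg_watch_seconds: float, video_seconds: float, views: int) -> float:
    """Rank by total watch time -- content that keeps people watching wins,
    regardless of whether they liked it."""
    return views * min(avg_watch_seconds, video_seconds)

# A widely disliked but compulsively watched video loses under ratio ranking
# but wins under engagement ranking.
calm_likes, calm_dislikes, calm_watch = 900, 100, 60    # well liked, briefly watched
rage_likes, rage_dislikes, rage_watch = 200, 800, 400   # widely disliked, long watched

print(ratio_score(calm_likes, calm_dislikes))        # 0.9
print(ratio_score(rage_likes, rage_dislikes))        # 0.2
print(engagement_score(calm_watch, 600, 1000))       # 60000
print(engagement_score(rage_watch, 600, 1000))       # 400000
```

Under the ratio metric the "calm" video dominates; under the engagement metric the ordering flips, which is the self-policing failure the comment describes.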


Dislikes are not disabled, just the count is not visible. You can still see the ratio in your creator studio.

The algorithm is optimising for view time; I think that's far more of a variable than dislikes, and people will dislike content but still watch a significant portion of it.


Yeah, that is what I meant when I said it is optimizing for engagement, i.e. view time. The early YouTube algo did take dislikes into account and tried to optimize for quality. Dislikes are effectively disabled.

A high dislike count isn't something content creators need to worry about, hence the more extreme content.


It is something they need to worry about. It’s just less effective for brigading videos.

Except now those people over-focus on the chrome extension that records the like:dislike ratio of videos (of extension users only) and they end up filtering themselves by thinking it’s authoritative for the overall view population.


People voluntarily watching their favorite stuff that occasionally offends your sensibilities. Perish the thought!

The situation on the ground is that YouTube suppresses a lot of harmless videos and channels under the pretense that it is "extreme" stuff. They also demonetize anything they like and you have no recourse as an uploader, unless your channel is making them insane money and they want to keep you at all costs. What good would subscriptions be for if they did not increase the odds of you seeing sub videos in your feed? Stop trying to tell people they shouldn't watch stuff you disagree with.


The academy's fear of 'harmful' words has become more dangerous than the words themselves.


> Given the challenges of trying to characterize the content of every single video viewed, the researchers focused on the type of YouTube channels people watched. They compiled lists of channels that had been identified as alternative or extreme by journalists and academics and then examined how often a participant visited videos from those channels.

> Alternative channels included content for men’s rights activists, anti-social justice warriors, and intellectual dark web material; extreme channels included white supremacist, alt-right, and extremist material.

Eh...

Relying on "journalists and academics disliking a channel enough to claim it's extreme," but then not bothering to actually go look at the details of the material watched, sounds pretty lazy to me if you're trying to make this sort of argument.

Or at least sample a reasonable cross section (perhaps videos that showed up in a lot of the monitored profiles) and make sure they're generally consistent with the claimed nature of the channel.


Yeah. Given that half of America votes for a party that many people inside the party would describe as 'anti-social justice', it's pretty hard to guess at what is being regarded as harmful.

The Intellectual Dark Web includes stuff like 'Quillette', which most people would have no problem with.

The bit at the start:

"In 2019, YouTube announced that changes to its algorithms had reduced watch time of harmful content by 50%, with a 70% decline in watch time by nonsubscribers."

Seems like a very good result for YouTube. They are funneling people away from politics unless you go out and seek it.

Most people are presumably far more interested in their niche hobby YouTube videos anyway rather than politics which is often a contest of who is less awful.


> The Intellectual Dark Web includes stuff like 'Quillette', which most people would have no problem with.

Just checking Wikipedia:

> Quillette has published articles supporting the "human biodiversity movement" (HBM), which attempts to reintroduce ideas from eugenics and scientific racism into the mainstream.

> HBM refers to beliefs that human behaviors are impacted by inherited genes, and certain predispositions are unique to certain ethnic groups

https://en.wikipedia.org/wiki/Quillette

I sure hope most people would have a problem with that crap.


You, sir, are throwing the baby out with the bathwater.


They have thousands of articles written by a range of authors.

How much of "that crap" is enough to discard the entire publication as extreme?

Quillette stands out as a place where you can read about almost anything. You may disagree with articles - good! That's worth something these days.


You think it is good that people have a platform where they can spread racism?

I am German, I have visited the concentration camps. You cannot tolerate people like this. They cannot have any right to speak. After all, they are not going to tolerate mine once they gain power. People wanting to murder me is not a matter of opinion.

You might think I am being dramatic, but that is because you might be (for now) sheltered from the consequences. Asylum seekers having their accommodations burned down, trans people being murdered, black people being shot by the police, and women dying because doctors are too scared of anti-abortion laws are not so lucky.


I'm American. I prefer to err on the side of free speech, versus over-aggressive censoring of what can be written/thought/etc.

Quite honestly, I've never seen anything "racist" on Quillette, and I've read them for some while in the past. There are certainly a wide range of positions held by the assorted authors, and it's one of few sites that hadn't (at least a few years ago when I was reading it more regularly) fallen to "We cannot so much as think anything that does not align to our political orthodoxy."

"A Wikipedia article stating that a site at some point had some articles on an topic many find offensive" is not enough for me to classify a platform as a racist platform.



