Source: Technology Review
A new study [PDF] suggests that what we've suspected for years is right: YouTube is a pipeline for extremism and hate.
How do we know that? More than 330,000 videos on nearly 350 YouTube channels were analyzed and manually classified according to a system designed by the Anti-Defamation League. They were labeled as either media (what we think of as factual news), "alt-lite," "intellectual dark web," or alt-right.
[...] The alt-right is what's traditionally associated with white supremacy, pushing for a white ethnostate. Those who affiliate with the "intellectual dark web" justify white supremacy on the basis of eugenics and "race science." Members of the alt-lite purport to not support white supremacy, though they believe in conspiracy theories about "replacement" by minority groups.
[...] The study's authors hypothesized that the alt-lite and intellectual dark web often serve as a gateway to more extreme, far-right ideologies. So they tested that by tracing the authors of 72 million comments on about two million videos between May and July of last year. The results were worrying. More than 26% of people who commented on alt-lite videos tended to drift over to alt-right videos and subsequently comment there.
[...] The team, from the Swiss Federal Institute of Technology Lausanne, also found evidence that the overlap between alt-righters and others who dabble in intellectual dark web and alt-lite material is growing. The authors estimate that about 60,000 people who commented on alt-lite or intellectual dark web content were exposed to alt-right videos over a period of about 18 months. The work was presented at the 2020 Conference on Fairness, Accountability, and Transparency in Barcelona this week.
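The measurement at the heart of the study is simple to state: of the users who commented on videos in one category, what fraction later went on to comment on videos in a more extreme category? As a rough illustration only (this is not the authors' code; the record format, field names, and toy data below are invented for the example), a migration rate like the 26% figure could be computed from timestamped comment logs along these lines:

```python
from collections import defaultdict

# Hypothetical comment records: (user_id, unix_timestamp, category),
# where category is the manual label of the channel the comment was
# posted on: "media", "alt-lite", "idw", or "alt-right".
comments = [
    ("u1", 1557000000, "alt-lite"),
    ("u1", 1560000000, "alt-right"),
    ("u2", 1557500000, "alt-lite"),
    ("u3", 1558000000, "idw"),
]

def migration_rate(comments, source, target):
    """Fraction of users who commented on `source` videos and then
    commented on `target` videos at some later time."""
    first_source = {}                 # user -> earliest comment on source
    target_times = defaultdict(list)  # user -> all comment times on target
    for user, ts, cat in comments:
        if cat == source:
            first_source[user] = min(ts, first_source.get(user, ts))
        elif cat == target:
            target_times[user].append(ts)
    migrated = sum(
        1 for user, t0 in first_source.items()
        if any(t > t0 for t in target_times[user])
    )
    return migrated / len(first_source) if first_source else 0.0

print(migration_rate(comments, "alt-lite", "alt-right"))  # 0.5 on the toy data
```

The real analysis works at a far larger scale (72 million comments across two million videos) and depends on the quality of the manual channel labels, but the underlying before/after comparison of a user's commenting activity is the same idea.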
In a statement, YouTube said it's working through these issues: "Over the past few years ... we changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations, and began reducing recommendations of borderline content and videos that could misinform users in harmful ways."
A spokesperson added that YouTube disputes the methodology, saying the study doesn't take into account more recent updates to the site's hate speech policy or recommendation system. "We strongly disagree with the methodology, data and, most importantly, the conclusions made in this new research," the spokesperson said.
(Score: 1, Interesting) by Ethanol-fueled on Friday January 31 2020, @06:27PM (2 children)
Because the people who design these algorithms are arrogant Shylocks who think the human cattle utilizing their algos are too stupid to figure it out, and won't at all be insulted by having their preferences chosen for them by the chosen.
Nowadays YouTube appears to be 50% a propaganda arm of the Democrat/Globalist party and 50% behavioral experiment to see how many hoops you will jump through to find the video that used to be at the top of your search results. This is why people should just get a YouTube downloader and download all the vids they like to be conveniently watched anytime with or without an internet connection.
The people who run YouTube could decide overnight that it is "no longer profitable" and just pull the whole thing, leaving you high and dry. Take what you can from it and bail; they don't respect you, and you shouldn't respect them by contributing to their experiments.
(Score: 4, Insightful) by ikanreed on Friday January 31 2020, @07:02PM
Even as an arrogant shylock, I feel comfortable telling you to fuck off ya racist dipshit.
(Score: 2) by hendrikboom on Friday February 07 2020, @04:04AM
Like the people now in charge of YouTube did to Google+.