
posted by janrinok on Friday January 31 2020, @04:46PM
from the picking-holes dept.

Source: Technology Review

A new study [PDF] suggests what we've suspected for years is right: YouTube is a pipeline for extremism and hate.

How do we know that? More than 330,000 videos on nearly 350 YouTube channels were analyzed and manually classified according to a system designed by the Anti-Defamation League. They were labeled as either media (or what we think of as factual news), "alt-lite," "intellectual dark web," or alt-right.

[...] The alt-right is what's traditionally associated with white supremacy, pushing for a white ethnostate. Those who affiliate with the "intellectual dark web" justify white supremacy on the basis of eugenics and "race science." Members of the alt-lite purport to not support white supremacy, though they believe in conspiracy theories about "replacement" by minority groups.

[...] The study's authors hypothesized that the alt-lite and intellectual dark web often serve as a gateway to more extreme, far-right ideologies. They tested that by tracing the authors of 72 million comments on about two million videos between May and July of 2019. The results were worrying: more than 26% of people who commented on alt-lite videos tended to drift over to alt-right videos and subsequently comment there.
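
The migration measurement can be pictured roughly as follows. This is only an illustrative sketch, not the authors' actual pipeline; the record fields ("user", "timestamp", "category") and category labels are assumptions made for the example. The idea is: for each commenter, check whether a comment on an alt-lite video is eventually followed by a comment on an alt-right video, then report the fraction of exposed commenters who made that move.

```python
from collections import defaultdict

def migration_rate(comments, from_cat="alt-lite", to_cat="alt-right"):
    """Fraction of commenters on `from_cat` videos who later also
    comment on `to_cat` videos (hypothetical record format)."""
    by_user = defaultdict(list)
    for c in comments:
        by_user[c["user"]].append((c["timestamp"], c["category"]))

    exposed = 0   # users with at least one from_cat comment
    migrated = 0  # of those, users with a later to_cat comment
    for events in by_user.values():
        events.sort()                 # order each user's comments by time
        first_from = None
        drifted = False
        for ts, cat in events:
            if cat == from_cat and first_from is None:
                first_from = ts
            elif cat == to_cat and first_from is not None:
                drifted = True
        if first_from is not None:
            exposed += 1
            migrated += drifted
    return migrated / exposed if exposed else 0.0

# Tiny usage example with made-up records:
log = [
    {"user": "a", "timestamp": 1, "category": "alt-lite"},
    {"user": "a", "timestamp": 5, "category": "alt-right"},
    {"user": "b", "timestamp": 2, "category": "alt-lite"},
]
print(migration_rate(log))  # 0.5
```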

[...] The team, from the Swiss Federal Institute of Technology Lausanne, also found evidence that the overlap between alt-righters and others who dabble in intellectual dark web and alt-lite material is growing. The authors estimate that about 60,000 people who commented on alt-lite or intellectual dark web content got exposed to alt-right videos over a period of about 18 months. The work was presented at the 2020 Conference on Fairness, Accountability, and Transparency in Barcelona this week.

In a statement, YouTube said it's working through these issues: "Over the past few years ... We changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations and begun reducing recommendations of borderline content and videos that could misinform users in harmful ways."

A spokesperson added that YouTube disputes the methodology and that it doesn't take into account more recent updates to its hate speech policy or recommendations. "We strongly disagree with the methodology, data and, most importantly, the conclusions made in this new research," the spokesperson said.


Original Submission

 
  • (Score: 1, Interesting) by Anonymous Coward on Friday January 31 2020, @05:51PM (4 children)

    by Anonymous Coward on Friday January 31 2020, @05:51PM (#951853)

    I'm a 50-something discontented man, and the algorithm doesn't give me any of those guys as recommendations. All I get is repeats from the few subscriptions I have, and mind-numbing amounts of anime and synthpop videos. (You get curious once and they think that's all you like...)

    Oh, and when really bored I do watch a lot of the "dumb drivers causing wrecks" videos, but I never get any of them as recommendations. Curious, that. Perhaps it's because with my ad-blocker I never see any ads, so the holy mystical algorithm is punishing me.

  • (Score: 0) by Anonymous Coward on Friday January 31 2020, @06:30PM (2 children)

    by Anonymous Coward on Friday January 31 2020, @06:30PM (#951883)

    You need to create bookmarks for all of the channels you like. You can't even trust the subscriptions feature to work properly anymore. Also make sure to look for secondary channels, Twitter profiles, BitChute backup, etc.

    You could try clearing your browser or account history to reset recommendations, and browse anime and music videos in incognito so they don't poison your recommendations.

    • (Score: 0) by Anonymous Coward on Friday January 31 2020, @07:01PM (1 child)

      by Anonymous Coward on Friday January 31 2020, @07:01PM (#951897)

      what is wrong with you goddamn slaves recommending more closed source shit instead of open source, decentralized solutions? just how goddamned stupid are you fucking sheep?

      • (Score: 0) by Anonymous Coward on Friday January 31 2020, @07:17PM

        by Anonymous Coward on Friday January 31 2020, @07:17PM (#951904)

        99% of the eyeballs are on YouTube. If you aren't banned from there, of course you target that first. You decentralize by spreading across multiple platforms and linking them together. You only use the 0.01% open source, decentralized platforms as a last resort.

  • (Score: 0) by Anonymous Coward on Saturday February 01 2020, @10:00AM

    by Anonymous Coward on Saturday February 01 2020, @10:00AM (#952282)

    > Oh, and when really bored I do watch a lot of the "dumb drivers causing wrecks" videos, but I never get any of them as recommendations. Curious, that. Perhaps it's because with my ad-blocker I never see any ads, so the holy mystical algorithm is punishing me.

    I mostly have music videos playing. It's bloody annoying how many "dumb drivers causing wrecks" videos I get because I watched one once. That and "news bloopers".
    I can search for "80's Rock" and I'll get a third each of "News Bloopers", "Dumb Drivers", and "80's Rock".
    Why are they pushing the shit so hard?