
posted by Snow on Friday March 15 2019, @06:14PM   Printer-friendly
from the nielson-smielson-ratings-mean-nothing-except-to-a-reality-tv-show-president dept.

YouTube Recommendations for 'Alt-Right' Videos have Dropped Dramatically, Study Shows:

Google has made "major changes" to its recommendations system on YouTube that have reduced the amount of "alt-right" videos recommended to users, according to a study led by Nicolas Suzor, an associate professor at Queensland University of Technology.

During the first two weeks of February, alt-right videos appeared in YouTube's "Up Next" recommendations sidebar 7.8 percent of the time (roughly one in 13). From Feb. 15 onward, that number dropped to 0.4 percent (roughly one in 250).
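The "one in N" figures quoted above follow directly from the percentages; a minimal sketch checking the arithmetic (the helper name `one_in_n` is my own, not from the study):

```python
def one_in_n(rate_percent: float) -> int:
    """Convert a percentage rate into an approximate 'one in N' frequency."""
    return round(100 / rate_percent)

# Rates reported by the study for the "Up Next" sidebar:
before = one_in_n(7.8)  # first two weeks of February
after = one_in_n(0.4)   # Feb. 15 onward

print(before, after)  # 13 250
```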

Suzor's study took random samples of 3.6 million videos, and used 81 channels listed in a recent study by Rebecca Lewis [.pdf] as a starting point. That list includes voices like Richard Spencer, an American white supremacist, but also more mainstream voices like Joe Rogan, who does not self-identify as alt-right but often plays host to more extreme voices on his podcast (including alt-right figures such as Alex Jones).

The drop appears significant, but it's difficult to determine precisely why it occurred. We don't know whether YouTube is targeting 'alt-right' videos specifically or whether the drop-off is part of broader changes to YouTube's recommendation system.


Original Submission

 
  • (Score: 5, Insightful) by Immerman on Friday March 15 2019, @07:45PM (2 children)

    by Immerman (3985) on Friday March 15 2019, @07:45PM (#815004)

    Sorry, I must disagree. The entire concept of a recommendation engine is to steer what people watch. The "neutral" form tries to steer them toward things they're more likely to sit through the ads for. A more socially responsible version will also try to steer them away from provably false content claiming to be true. And let's be honest - while a lot of "alt-right" content is legitimate difference of opinion, there's also an awful lot of it that involves active deception, through either intent or ignorance.

  • (Score: 1, Offtopic) by Phoenix666 on Saturday March 16 2019, @04:15AM (1 child)

    by Phoenix666 (552) on Saturday March 16 2019, @04:15AM (#815276) Journal

    there's also an awful lot of it that involves active deception, through either intent or ignorance.

    You mean like what the MSM did to the Covington kids, and are now getting sued for hundreds of millions of dollars for? You mean like promulgating Jussie Smollett's racist hoax and whipping up more racial division?

    Pot calling kettle black.

    I don't need Big Brother, anyone's Big Brother, pre-sifting all information for me to decide what's good for me and what isn't, what's "true" and what isn't. First, I'm not a child and can judge for myself. Second, there is no absolute objective definition of what is "true" and what isn't. What one person gets triggered by is another person's sensible statement.

    Seriously, does nobody read Bradbury or Vonnegut anymore? Are we about to see firefighters ransacking houses and burning books because some self-righteous prig considers them wrongthink?

    It is deeply depressing that so many are ready to discard the Western tradition of free inquiry and critical thought to chase a fleeting illusion of acceptability.

    --
    Washington DC delenda est.
    • (Score: 3, Insightful) by Immerman on Saturday March 16 2019, @07:22PM

      by Immerman (3985) on Saturday March 16 2019, @07:22PM (#815550)

      What you seem to be missing, and I'm beginning to believe it's willful, is that the recommendation engine is *already* sifting through all that information for you. If you use it at all, you're voluntarily submitting yourself to that.

      And unfortunately, if your only sifting criterion is "maximum viewer engagement", then you're going to tend to go off the rails very quickly, because most people are extremely bad at rational thought, and easily engaged and deceived by their pre-existing biases.