
posted by cmn32480 on Sunday December 18 2016, @10:22PM
from the peeling-the-onion dept.

Facebook has detailed its plan to deal with fake news appearing on the platform. It involves labeling false information with a link to a fact-checking site, warning users when they attempt to repost flagged items, and ranking those items lower in the news feed:

Facebook has struggled for months over whether it should crack down on false news stories and hoaxes that are being spread on its site. Now, it has finally come to a decision. The social network is going to partner with the Poynter International Fact-Checking Network, which includes groups such as Snopes and the Associated Press, to evaluate articles flagged by Facebook users. If those articles do not pass the smell test for the fact-checkers, Facebook will flag them as disputed whenever they are posted or shared, along with a link to the organization that debunked the story. Many of the organizations said that they're not getting paid for this.

"We have a responsibility to reduce the spread of fake news on our platform," Adam Mosseri, Facebook vice president of product development, told The Washington Post. Mosseri said the social network still wants to be a place where people with all kinds of opinions can express themselves but has no interest in being the arbiter of what's true and what's not for its 1 billion users.

The new system will work like this: If a story on Facebook is patently false — saying that a celebrity is dead when they are still alive, for example — then users will see a notice that the story has been disputed or debunked. People who try to share stories that have been found false will also see an alert before they post. Flagged stories will appear lower in the news feed than unflagged stories. Users will also be able to report potentially false stories to Facebook or send messages directly to the person posting a questionable article.
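
As a rough illustration only (the names, data structures, and ranking penalty below are my own hypothetical assumptions, not anything Facebook has published), the flag-and-demote workflow described above might look something like this:

from dataclasses import dataclass, field

@dataclass
class Story:
    url: str
    disputed_by: list = field(default_factory=list)  # (fact-checker, debunk URL) pairs
    base_rank: float = 1.0

def report_story(story, review_queue):
    # Users can report potentially false stories; reports go to fact-checkers for review.
    review_queue.append(story)

def dispute_story(story, checker, debunk_url):
    # A fact-checker from the Poynter network marks the story as disputed.
    story.disputed_by.append((checker, debunk_url))

def render(story):
    # Attach the disputed notice and a link to the debunking organization when shown or shared.
    if story.disputed_by:
        checker, link = story.disputed_by[0]
        return f"{story.url}  [Disputed by {checker}: {link}]"
    return story.url

def feed_rank(story):
    # Flagged stories appear lower in the news feed than unflagged ones.
    penalty = 0.5 if story.disputed_by else 0.0  # arbitrary illustrative penalty
    return story.base_rank - penalty

def pre_share_warning(story):
    # Warn the user before they share a story that fact-checkers have found false.
    if story.disputed_by:
        return "This story has been disputed by third-party fact-checkers. Share anyway?"
    return None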

The Pew Research Center also released a survey about fake news, finding that a majority of Americans believe that fake news has caused confusion about the basic facts of current events.


Original Submission

 
  • (Score: 3, Insightful) by FlatPepsi on Sunday December 18 2016, @11:17PM

    by FlatPepsi (3546) on Sunday December 18 2016, @11:17PM (#442858)

    "if you ask amateurs to act as front-line security personnel, you shouldn't be surprised when you get amateur security." (1)

    You're asking the masses to screen news articles; I would expect low-quality, inconsistent results. Depending on the day of the week, you may see CNN (correctly) marked as a fake news site, or, the next day, "The Onion" marked as reliable - if it's a funny article.

    I have low expectations.

    (1): https://www.schneier.com/blog/archives/2010/05/if_you_see_some_1.html [schneier.com]

  • (Score: 0) by Anonymous Coward on Sunday December 18 2016, @11:22PM

    by Anonymous Coward on Sunday December 18 2016, @11:22PM (#442860)

    You're asking the masses to screen news articles; I would expect low-quality, inconsistent results

    One could argue the same about Wikipedia, and sometimes they do let through a whopper. However, many people agree that Wikipedia is a very useful service.

    • (Score: 1, Informative) by Anonymous Coward on Monday December 19 2016, @01:30AM

      by Anonymous Coward on Monday December 19 2016, @01:30AM (#442889)

      However, many people agree that Wikipedia is a very useful service.

      Well, yes and no. There is a lot of useful information on Wiki; in fact, I often do use it in my work as a research scientist. But...I would be rather sceptical of anything that cites Wiki as a reference on some hot-button social or political issue. (I'm sure everyone who reads SN is well aware of the editor wars that frequently erupt over pages with such contentious content.) At the very least, I would want to see one or two additional independent references for anything on Wiki that is contentious in nature before I would even begin to take it seriously. As with much of the rest of life, Caveat Emptor applies to Wikipedia.

  • (Score: 2) by takyon on Sunday December 18 2016, @11:27PM

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Sunday December 18 2016, @11:27PM (#442861) Journal

    Most of the plan does not involve using the masses.

    I assume that Facebook wants to match a steady stream of fact checking from Snopes, Poynter, or whatever makes the whitelist, against the incoming articles, using machine learning to figure out which news feed items should get an appended link to a related fact-checking article.
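
    A crude, purely speculative sketch of what that matching could look like, with a trivial token-overlap heuristic standing in for whatever machine-learning model they might actually use (the tokenizer, similarity measure, and threshold are all arbitrary choices of mine):

    import re

    def tokens(text):
        # Lowercase word tokens; good enough for a toy demo.
        return set(re.findall(r"[a-z']+", text.lower()))

    def jaccard(a, b):
        # Overlap between two token sets, 0.0 when both are empty.
        return len(a & b) / len(a | b) if a | b else 0.0

    def related_fact_check(headline, fact_checks, threshold=0.3):
        # Return the URL of the most similar fact-check, if it is similar enough to append.
        best_url, best_score = None, 0.0
        article_tokens = tokens(headline)
        for claim, url in fact_checks:
            score = jaccard(article_tokens, tokens(claim))
            if score > best_score:
                best_url, best_score = url, score
        return best_url if best_score >= threshold else None

    # Example with a hypothetical debunked celebrity-death hoax:
    fact_checks = [("celebrity X is dead hoax debunked", "https://example.org/fact-check/123")]
    print(related_fact_check("Hoax claims celebrity X is dead", fact_checks))
    # -> https://example.org/fact-check/123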

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Interesting) by stormwyrm on Sunday December 18 2016, @11:58PM

    by stormwyrm (717) on Sunday December 18 2016, @11:58PM (#442865) Journal

    My understanding, even from just the summary, is that it won't work that way. Ordinary people can report news links they think might be fake, but the system only uses that as one input, perhaps as a mark for further scrutiny by professional fact-checkers before something gets flagged as possibly false. The real problem here is, as always: quis custodiet ipsos custodes? How far can we really trust these professional fact checkers?

    --
    Numquam ponenda est pluralitas sine necessitate.
    • (Score: 0) by Anonymous Coward on Monday December 19 2016, @02:08AM

      by Anonymous Coward on Monday December 19 2016, @02:08AM (#442902)

      > How far can we really trust these professional fact checkers?

      Make your decision based on how they handle their mistakes.
      Do they double-down and pretend everything is fine?
      Or do they make a public retraction?

      If the latter, then that's about as good as you can reasonably expect.