posted by mrpg on Monday May 22 2017, @10:01PM   Printer-friendly
from the +1-interesting dept.

The Guardian has posted a number of documents in connection with their investigation of Facebook's policies:

Facebook's secret rules and guidelines for deciding what its 2 billion users can post on the site are revealed for the first time in a Guardian investigation that will fuel the global debate about the role and ethics of the social media giant.

The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.

The Facebook Files give the first view of the codes and rules formulated by the site, which is under huge political pressure in Europe and the US.

They illustrate difficulties faced by executives scrabbling to react to new challenges such as "revenge porn" – and the challenges for moderators, who say they are overwhelmed by the volume of work, which means they often have "just 10 seconds" to make a decision. "Facebook cannot keep control of its content," said one source. "It has grown too big, too quickly."

Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing.

Here's a reaction from a UK child safety charity:

Asked for a response to Facebook's moderation guidelines, a spokesperson for the UK's National Society for the Prevention of Cruelty to Children described the rules as "alarming" and called for independent regulation of the platform's moderation policies — backed up with fines for non-compliance.

"This insight into Facebook's rules on moderating content is alarming to say the least," the spokesperson told us. "There is much more Facebook can do to protect children on their site. Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe."

See also: EU audiovisual reform will create a nanny state (Estonian MEP opinion)


Original Submission

  • (Score: 2) by nobu_the_bard (6373) on Tuesday May 23 2017, @12:57PM (#514188)
    Of course the guidelines are insufficient. They represent a subset of American values. People from other countries have different values!

    For most of the guidelines, as far as I've read them, I don't see how I could have done better and still kept them enforceable. I wouldn't want to be in the position of having to censor more than the minimum, either. It's also probably in their best interest, as a platform that benefits from a wide audience, to allow as many types of content and users as possible; they aren't really incentivized to censor.

    I think if they haven't already, they need to start thinking about some kind of geo-filtering, or else accept that they'll have to adopt the most restrictive "lowest common denominator" rules possible or face being blocked in some countries.
