SoylentNews is people

posted by mrpg on Monday May 22 2017, @10:01PM   Printer-friendly
from the +1-interesting dept.

The Guardian has posted a number of documents in connection with their investigation of Facebook's policies:

Facebook's secret rules and guidelines for deciding what its 2 billion users can post on the site are revealed for the first time in a Guardian investigation that will fuel the global debate about the role and ethics of the social media giant.

The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.

The Facebook Files give the first view of the codes and rules formulated by the site, which is under huge political pressure in Europe and the US.

They illustrate difficulties faced by executives scrabbling to react to new challenges such as "revenge porn" – and the challenges for moderators, who say they are overwhelmed by the volume of work, which means they often have "just 10 seconds" to make a decision. "Facebook cannot keep control of its content," said one source. "It has grown too big, too quickly."

Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing.

Here's a reaction from a UK child safety charity:

Asked for a response to Facebook's moderation guidelines, a spokesperson for the UK's National Society for the Prevention of Cruelty to Children described the rules as "alarming" and called for independent regulation of the platform's moderation policies — backed up with fines for non-compliance.

"This insight into Facebook's rules on moderating content is alarming to say the least," the spokesperson told us. "There is much more Facebook can do to protect children on their site. Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe."

See also: EU audiovisual reform will create a nanny state (Estonian MEP opinion)


Original Submission

  • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @02:04AM (#513913)

    It's explicit and clear when you have a clear cultural frame of reference. If you took me and showed me a picture, I would be able to tell you whether it would be lewd and need to be deleted in the town I grew up in. Within a thousand miles from here there will be places that have a more conservative attitude (not politics) about what's lewd, and there will be places that are more permissive.

    Facebook's problem is that it's trying to model the idea of lewdness on a global scale. In essence, they're disregarding diversity and trying to make everybody happy at once. That's why breast cancer awareness images get banned until there's a media uproar. (It seems a media uproar is the only thing that can call Facebook's attention to a problem.)