
posted by mrpg on Monday May 22 2017, @10:01PM
from the +1-interesting dept.

The Guardian has posted a number of documents in connection with their investigation of Facebook's policies:

Facebook's secret rules and guidelines for deciding what its 2 billion users can post on the site are revealed for the first time in a Guardian investigation that will fuel the global debate about the role and ethics of the social media giant.

The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.

The Facebook Files give the first view of the codes and rules formulated by the site, which is under huge political pressure in Europe and the US.

They illustrate difficulties faced by executives scrabbling to react to new challenges such as "revenge porn" – and the challenges for moderators, who say they are overwhelmed by the volume of work, which means they often have "just 10 seconds" to make a decision. "Facebook cannot keep control of its content," said one source. "It has grown too big, too quickly."

Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing.

Here's a reaction from a UK child safety charity:

Asked for a response to Facebook's moderation guidelines, a spokesperson for the UK's National Society for the Prevention of Cruelty to Children described the rules as "alarming" and called for independent regulation of the platform's moderation policies — backed up with fines for non-compliance.

"This insight into Facebook's rules on moderating content is alarming to say the least," the spokesperson told us. "There is much more Facebook can do to protect children on their site. Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe."

See also: EU audiovisual reform will create a nanny state (Estonian MEP opinion)


Original Submission

  • (Score: 3, Insightful) by fadrian on Monday May 22 2017, @10:49PM (3 children)

    by fadrian (3194) on Monday May 22 2017, @10:49PM (#513821) Homepage

    Those on sexual content, for example, are said to be the most complex and confusing.

    As in real life. And they're beating up Facebook because they can't consistently model inconsistency? There are a lot of reasons to hate on them, but this is not one of them.

    --
    That is all.
    • (Score: 3, Funny) by bob_super on Monday May 22 2017, @10:58PM

      by bob_super (1357) on Monday May 22 2017, @10:58PM (#513827)

      It's not really confusing.
      For tech guys, it's more like shopping for a fancy new toy: spend years familiarizing yourself with the specs and finding your best fit, and as soon as you finally acquire one, it's broken, there are no returns, and the other models mostly seem like they'd have been better.

    • (Score: 2) by kaszz on Monday May 22 2017, @11:05PM (1 child)

      by kaszz (4211) on Monday May 22 2017, @11:05PM (#513831) Journal

      They must be complex and confusing, because that is the only way the message can be told when it cannot be made explicit and clear.

      • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @02:04AM

        by Anonymous Coward on Tuesday May 23 2017, @02:04AM (#513913)

        It's explicit and clear when you have a clear cultural frame of reference. If you took me and showed me a picture, I would be able to tell you whether it would be lewd and need to be deleted in the town I grew up in. Within a thousand miles from here there will be places that have a more conservative attitude (not politics) about what's lewd, and there will be places that are more permissive.

        Facebook's problem is it's trying to model the idea of lewdness on a global scale. In essence, they're disregarding diversity and trying to make everybody happy at once. That's why breast cancer awareness images get banned until there's a media uproar. (Seems a media uproar is the only thing that can call Facebook's attention to a problem.)

  • (Score: 2) by kaszz on Monday May 22 2017, @11:16PM (3 children)

    by kaszz (4211) on Monday May 22 2017, @11:16PM (#513838) Journal

    Seems the UK nanny implementation charity is going wild with this one:

    Alarming! Outlaw platforms that don't bend their moderation policies to our will.

    All this alarm, and others shall nanny their children... and adults. US free speech may be repulsive at times for what it allows, but in the end it wins out, because suppressed discussion has a tendency to become worse than discussion that is bad but visible. The church tried the punishment approach for hundreds of years, with a disastrous outcome.

    And just think how society would be if you had this non-elected corporation meddling with what people say to each other. It screams moral superiority and "besserwisser" all the way. That is probably where always-connected, mainframe-linked microphones with voice interpretation will end up.

    • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @03:28AM (2 children)

      by Anonymous Coward on Tuesday May 23 2017, @03:28AM (#513948)

      The US free speech may be repulsive at times for what it allows.

      No, that's what makes it better. Having more free speech is not merely 'less bad'; it's the only ethical and principled approach, even if it could be guaranteed that censorship of speech you don't like would never result in unintended consequences. We must reject censorship on principle, always and forever; arguments that censorship is a slippery slope are weak defenses of freedom of speech and should serve as secondary arguments only.

      • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @03:32AM (1 child)

        by Anonymous Coward on Tuesday May 23 2017, @03:32AM (#513949)

        With that said, there is plenty of (unconstitutional) censorship in the US. The FCC mandates some amount of censorship for broadcasters, our authoritarian courts invented the Miller Test, and so on. Most of the population isn't that committed to the principle of freedom of speech and could be convinced to support censorship if something sufficiently 'outrageous' happened, just like they could be convinced to support unconstitutional surveillance again if another large-scale terrorist attack on the level of 9/11 occurred.

        • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @07:35PM

          by Anonymous Coward on Tuesday May 23 2017, @07:35PM (#514482)

          Fuck, Shit, Piss, Cunt, Asshole, Cocksucker, Motherfucker

  • (Score: 2, Troll) by Arik on Monday May 22 2017, @11:22PM (6 children)

    by Arik (4543) on Monday May 22 2017, @11:22PM (#513840) Journal
    Facebook is cancer.

    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @12:25AM (5 children)

      by Anonymous Coward on Tuesday May 23 2017, @12:25AM (#513859)

      Why is this voted insightful? It's not even a good analogy. If we're using the analogy of a controversial subject as a human disease or ailment, Facebook is not cancer, where you're 'dead' (read: bad things happen) guaranteed and it's only a matter of time. Since it is possible to walk away from FB, it's not even a disease or an ailment. It's an emotional addiction. So perhaps you should use the format

      Facebook is addictive drug reference

      like 'Facebook is heroin' or 'Facebook is Bath Salts' or 'Facebook is Paint Thinner'

      • (Score: 1, Redundant) by Arik on Tuesday May 23 2017, @12:30AM (3 children)

        by Arik (4543) on Tuesday May 23 2017, @12:30AM (#513860) Journal
        No, it's cancer.
        --
        If laughter is the best medicine, who are the best doctors?
        • (Score: 1, Informative) by Anonymous Coward on Tuesday May 23 2017, @01:00AM (2 children)

          by Anonymous Coward on Tuesday May 23 2017, @01:00AM (#513878)

          No, third-wave feminism is cancer. Facebook is like that Oakland, California warehouse fire [theguardian.com]: a dilapidated building, but a rave with lots of people there. Looks like good, fun times, until a fire starts with a bunch of people in a tight space and only one way out.

          • (Score: 2, Informative) by Arik on Tuesday May 23 2017, @01:30AM (1 child)

            by Arik (4543) on Tuesday May 23 2017, @01:30AM (#513889) Journal
            "No, third-wave feminism is cancer."

            While that's annoying, it's more of a niche problem; it doesn't have very broad appeal and likely never will (it appears to have passed its peak of popularity just in the last year, in fact).

            BookFace, on the other hand, has (unfortunately) very broad appeal. That is precisely what puts it in a position to be a true cancer. It appeals to young and old, rich and poor, black and white - and channels them all together into a fake, controlled, AOL-like version of the web where they can be controlled and exploited at leisure. It turns the free and open web on its head, like cancer cells replacing healthy ones.
            --
            If laughter is the best medicine, who are the best doctors?
            • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @08:05AM

              by Anonymous Coward on Tuesday May 23 2017, @08:05AM (#514075)

              While that's annoying, it's more of a niche problem; it doesn't have very broad appeal and likely never will (it appears to have passed its peak of popularity just in the last year, in fact).

              It has not passed its peak of popularity; it has passed on from the popular news into the more mundane and unreported thing called the law. Now you won't hear about it; you will only hear about the increase in rape and "rape culture" (ugh...), and occasionally you will be asked to contribute your hate towards a rapist who is not getting convicted, until it is you. Then no one will hear you.

      • (Score: 2) by takyon on Tuesday May 23 2017, @02:02AM

        by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday May 23 2017, @02:02AM (#513912) Journal

        Could be a joke based on the recent Daily Mail headline.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by nobu_the_bard on Tuesday May 23 2017, @12:57PM

    by nobu_the_bard (6373) on Tuesday May 23 2017, @12:57PM (#514188)

    Of course the guidelines are insufficient. They represent a subset of American values. People from other countries have different values!

    As for most of the guidelines, as far as I've read them, I don't see how I could have done better and still kept them enforceable. I wouldn't want to be in the position of having to censor more than the minimum, either. It's also probably in their best interest, as a platform that benefits from a wide audience, to allow as many types of content and users as possible; they aren't really incentivized to censor.

    I think, if they haven't already, they need to start thinking about some kind of geo-filtering, or else accept that they're going to have to adopt the most restrictive "lowest common denominator" rules possible or face being blocked in some countries.
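
    As a rough illustration of what per-region filtering could look like (just a sketch in Python; the region codes, content categories, and rules here are invented for the example, not anything Facebook actually uses):

        # Hypothetical per-region visibility check: a post is shown in a region
        # only if every category it is flagged for is allowed by that region's rules.
        REGION_RULES = {
            "US": {"nudity": False, "graphic_violence": True},
            "DE": {"nudity": True, "graphic_violence": False},
            "SA": {"nudity": False, "graphic_violence": False},
        }

        def visible_in(region, post_categories):
            rules = REGION_RULES.get(region, {})
            # Unknown regions or categories default to blocked
            # (the "lowest common denominator" fallback).
            return all(rules.get(category, False) for category in post_categories)

        print(visible_in("US", {"graphic_violence"}))  # True
        print(visible_in("DE", {"graphic_violence"}))  # False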

  • (Score: 2) by jmoschner on Tuesday May 23 2017, @03:42PM (1 child)

    by jmoschner (3296) on Tuesday May 23 2017, @03:42PM (#514314)

    Facebook needs more moderators and more training for those mods. In a city like NYC, you have about 40 police officers per 10,000 people. Facebook likely needs about that ratio of moderators to handle all the various disputes on the site. With 1.23 billion users, Facebook would need nearly 5 million moderators to police the site. Even at 1 moderator per 10,000 users, they would still need 123,000 moderators.

    Facebook has 4,500 moderators and plans to hire another 3,000 sometime "soon". That just isn't enough people to police their community.
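
    A rough back-of-the-envelope check of those numbers (a quick Python sketch; the 1.23 billion user count and the 40-per-10,000 ratio are the figures quoted above, not official Facebook numbers):

        # Back-of-the-envelope moderator staffing estimate, using the figures above.
        users = 1.23e9              # claimed Facebook user count
        nypd_ratio = 40 / 10_000    # roughly 40 police officers per 10,000 residents
        lean_ratio = 1 / 10_000     # a much leaner 1 moderator per 10,000 users

        print(f"At the NYPD ratio: {users * nypd_ratio:,.0f} moderators")  # ~4,920,000
        print(f"At 1 per 10,000:   {users * lean_ratio:,.0f} moderators")  # 123,000
        print(f"Current plus planned hires: {4500 + 3000:,}")              # 7,500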

    • (Score: 0) by Anonymous Coward on Tuesday May 23 2017, @03:51PM

      by Anonymous Coward on Tuesday May 23 2017, @03:51PM (#514325)

      We could be dicks and report every post... but that would involve us creating accounts.
