posted by martyb on Tuesday May 07 2019, @07:48AM   Printer-friendly
from the Can't-touch-this!-♫♬ dept.

Content Moderation At Scale Is Impossible: Facebook Still Can't Figure Out How To Deal With Naked Breasts:

[...] Going back over a decade, the quintessential example used to show the impossibility of coming up with clear, reasonable rules for content moderation at scale is Facebook and breasts. In the early days, as Facebook realized it needed to do some content moderation, and had to establish a clear set of rules that could be applied consistently by a larger team, it started with a simple "no nudity" policy -- and then after that raised questions, it was narrowed down to define female nipples as forbidden.

[...] This might have seemed like a straightforward rule... until mothers posting breastfeeding photos started complaining -- as they did after a bunch of their photos got blocked. Stories about this go back to at least 2008, when the Guardian reported on the issue after a bunch of mothers started protesting the company, leading Facebook to come up with this incredibly awkward statement defending the practice:

"Photos containing a fully exposed breast, as defined by showing the nipple or areola, do violate those terms (on obscene, pornographic or sexually explicit material) and may be removed," he said in a statement. "The photos we act upon are almost exclusively brought to our attention by other users who complain."

More public pressure, and more public protests, resulted in Facebook adjusting its policy to allow breastfeeding, but photos still kept getting taken down, leading the company to have to keep changing and clarifying its policy, such as in this statement from 2012.

[...] In 2014, Facebook clarified its policies on nipples again:

"Our goal has always been to strike an appropriate balance between the interests of people who want to express themselves with the interests of others who may not want to see certain kinds of content," a Facebook spokesperson told the Daily Dot. "It is very hard to consistently make the right call on every photo that may or may not contain nudity that is reported to us, particularly when there are billions of photos and pieces of content being shared on Facebook every day, and that has sometimes resulted in content being removed mistakenly.

"What we have done is modified the way we review reports of nudity to help us better examine the context of the photo or image," the spokesperson continued. "As a result of this, photos that show a nursing mothers' other breast will be allowed even if it is fully exposed, as will mastectomy photos showing a fully exposed other breast."

Right. And then, just a few months later, people started protesting again, as more breastfeeding photos were taken down.

[...] Late last week there were reports in Australia of some (reasonably) outraged people who were angry that Facebook was taking down a series of ads featuring breast cancer survivors.

[...] As the article notes, the ads showed "10 topless breast cancer survivors holding cupcakes to their chests". In another article Facebook gives its reasoning, which again reflects much of the history discussed above:

Facebook said it rejected the ads because they did not contain any education about the disease or teach women how to examine their breasts.

It said since the ads were selling a product, they were held to a higher standard than other images because people could not block ads the way they could block content from pages they followed.

So, clearly, over time the rule has evolved so that there's some sort of amendment saying that there needs to be an educational component if you're showing breasts related to breast cancer (remember, above, years back, Facebook had already declared that mastectomy photos are okay, and at least some of these ads do show post-mastectomy photos).
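
To make that point concrete, here is a purely hypothetical sketch in Python -- in no way Facebook's actual system, and the field names and conditions are assumptions drawn only from the policy history recounted in this story -- of what such an ever-amended rule set looks like once written down as code:

```python
# Hypothetical illustration only -- not Facebook's real moderation logic.
# A toy rule set mirroring the policy history described above: every
# revision adds another exception, and every exception turns on context
# (breastfeeding? mastectomy? ad? educational?) that is hard to judge
# consistently at scale.

from dataclasses import dataclass

@dataclass
class Photo:
    exposed_breast: bool
    breastfeeding: bool
    mastectomy: bool
    is_ad: bool
    educational: bool

def allowed(photo: Photo) -> bool:
    # Original rule: no exposed female breasts.
    if not photo.exposed_breast:
        return True
    # Later amendments: breastfeeding and mastectomy photos are permitted...
    if photo.breastfeeding or photo.mastectomy:
        # ...but ads are "held to a higher standard" and must also
        # contain an educational component.
        if photo.is_ad and not photo.educational:
            return False
        return True
    return False

# The cupcake ads from the article: post-mastectomy photos, but ads with
# no explicit educational content -- rejected, to public outcry.
print(allowed(Photo(exposed_breast=True, breastfeeding=False,
                    mastectomy=True, is_ad=True, educational=False)))  # False
```

Even this toy version only works if someone can reliably label each photo's context in the first place, which is exactly the judgment call that keeps going wrong at scale.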

[...] And those rules will never encompass every possible situation, and we'll continue to see stories like this basically forever. We keep saying that content moderation at scale is impossible to do well, and part of that is because of stories like this. You can't create rules that work in every case, and there are more edge cases than you can possibly imagine.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1, Insightful) by Anonymous Coward on Tuesday May 07 2019, @11:55AM (#840103) (4 children)

    Just allow everything.

    * It's not open to interpretation.
    * It's cheap to implement (no human moderators or software development/maintenance required)
    * It can't be politically biased
    * It can't be used to censor some but not others

    Anyone that objects to this can just go ahead and disable the "show images (friends|friends of friends|everyone)" feature.

    Problem fucking solved.

  • (Score: -1, Flamebait) by Anonymous Coward on Tuesday May 07 2019, @12:34PM (#840116)

    The Moslems won't be happy when you start posting pictures of bacon, and then they will blow up Facebook's HQ along with all its employees... oh wait, I guess your solution is the optimal one.

  • (Score: 3, Touché) by Rosco P. Coltrane (4757) on Tuesday May 07 2019, @01:27PM (#840129) (2 children)

    Okay smartass: kiddie porn okay then?

    Oops, one exception already...

    • (Score: 0) by Anonymous Coward on Tuesday May 07 2019, @06:27PM (#840328) (1 child)

      Okay smartass: kiddie porn okay then?

      For posting on something like Facebook? Yes, that's probably the best solution. I'm not the OP, but it would be interesting to see how his idea performs next to the current one. In addition I would add a button for reporting illegal posts, with a small hurdle included so that reporting something takes a little bit of time. That should scare off the illegal shit, since Facebook can probably give the cops plenty of information on whoever posted it.

      The problem with kiddie porn is that a photo of my cute young children in the bathtub, if I shared it with some friends or parents, would not be kiddie porn. If I shared it in some porn channel or wherever the pervs go, that would be kiddie porn.
      When my daughter and son grow up, at some point they are probably going to have sexual desires before they reach the legal age of consent in every jurisdiction.

      More to the point, child sex laws vary wildly across countries and districts, and much of it is morality based. Which is exactly the dilemma that the filters can't handle. Morality. Some breast photos are OK, others are not; it's not black and white.

      • (Score: 0) by Anonymous Coward on Tuesday May 07 2019, @10:31PM (#840480)

        The problem with kiddie porn is that a photo of my cute young children in the bathtub, if I shared it with some friends or parents, would not be kiddie porn. If I shared it in some porn channel or wherever the pervs go, that would be kiddie porn.

        That shouldn't be treated as child porn, but why would someone share that? Parents really shouldn't be sharing photos of their children online at all, because it's violating someone else's privacy (their child's) before they even know what's happening. As someone who grew up to be a privacy advocate, I would be furious if my parents had handed over all my information - including facial recognition data - to mega corporations before I even had a say in the matter.