A BBC investigation found 100 "sexualised images of children" on Facebook. Auntie Beeb reported the images to Facebook, which found over 80% of them to be "not in breach of their guidelines" - despite one of them including a still from a child abuse video with a label requesting viewers "share child pornography."
The twist is that when the BBC followed up on this failure, Facebook reported the BBC to the Child Exploitation and Online Protection Centre for "distributing images of child exploitation".
How can Facebook expect users to help them police their content when reporting abuse gets the users accused of the abuses they are reporting?
(Score: 4, Insightful) by BsAtHome on Wednesday March 08 2017, @07:46PM (3 children)
It is always easier to blame the messenger. Blame someone else and just repeat it often enough. The blame game is an effective game of disinformation. It does not help to solve the problem, but at least it is SEP(*) now.
(*) Somebody Else's Problem - see Douglas Adams' Hitchhiker's books for this particular phenomenon.
(Score: 0) by Anonymous Coward on Wednesday March 08 2017, @08:14PM
It's stupid, but is this a pattern or a one-off?
Because it sounds like it was just a drone mindlessly following "the rules."
You don't have to be a computer to be a robot.
(Score: 2) by stretch611 on Wednesday March 08 2017, @08:22PM
Blame someone else and just repeat it often enough.
Exactly. Facebook failed to take action when the posts were originally submitted for review. So instead of admitting that they screwed up, they decided to report the BBC when the BBC tried to report the posts a different way.
(Score: 5, Insightful) by Scruffy Beard 2 on Wednesday March 08 2017, @09:07PM
- IFPI’s child porn strategy [wordpress.com]
It is very dangerous to have information deemed illegal by its mere existence.