Facebook to pay $52m to moderators over PTSD:
Facebook has agreed to pay $52m (£42m) to content moderators as compensation for mental health issues developed on the job.
The agreement settles a class-action lawsuit brought by the moderators, as first reported by The Verge.
Facebook said it is using both humans and artificial intelligence (AI) to detect posts that violate policies.
The social media giant has increased its use of AI to remove harmful content during the coronavirus lockdown.
In 2018, a group of US moderators hired by third-party companies to review content sued Facebook for failing to create a safe work environment.
The moderators alleged that reviewing violent and graphic images - sometimes of rape and suicide - for the social network had led them to develop post-traumatic stress disorder (PTSD).
The agreement, filed in court in California on Friday, settles that lawsuit. A judge is expected to sign off on the deal later this year.
(Score: 5, Informative) by Anonymous Coward on Wednesday May 13 2020, @04:49PM
You are correct. You have proven you have no idea what PTSD is.
It is my pleasure to educate you as to the diagnostic criteria [va.gov]:
A4, any of the B symptoms, C1, D2-7, any E, F, G, and H may well be present in someone whose job it is to be exposed to such traumas. The real question is what Facebook did at the time to ameliorate or mitigate the problem. What services did it provide, and if someone had been exposed to too much, did it meaningfully address the problem by finding other work for the reviewer? My guess would be not enough, and no.