
posted by Fnord666 on Saturday January 14 2017, @06:32PM   Printer-friendly
from the who-has-the-mental-bleach dept.

Microsoft is being sued for allegedly not doing enough to help two of its employees who were suffering while on "porn detail":

Two former Microsoft employees who were responsible for monitoring child pornography and other criminal material have filed a lawsuit against the company, The Daily Beast reports, alleging that they were not provided with psychological support to treat post-traumatic stress disorder (PTSD).

The employees, Henry Soto and Greg Blauert, were part of Microsoft's Online Safety Team, where they were charged with reviewing material that had been flagged as potentially illegal. According to the lawsuit, Soto's job involved viewing "horrible brutality, murder, indescribable sexual assaults," and other content "designed to entertain the most twisted and sick-minded people in the world." Blauert had to "review thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators," according to the complaint.

Both men say they suffered "vicarious trauma" and symptoms associated with PTSD, including nightmares, anxiety, and hallucinations. When they complained about their health, Microsoft offered a "Wellness Program," but the suit alleges that the therapist involved with the program was not qualified to treat their symptoms. Program supervisors also advised them to take smoke breaks and walks to deal with their problems, while Blauert was advised to play more video games, according to the complaint.

Also at BBC, The Guardian, and Courthouse News, which also has a copy of the lawsuit.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by Bot (3902) on Saturday January 14 2017, @07:33PM (#453880) Journal

    Get child pornography convicts to do those kinds of jobs. It might even be boring for them.

    Or I dunno, recruit suitable people in the appropriate chan.

    Of course I know we bots will end up taking on that shitty job in the end. Get an AI, they said, it will be fun with an AI, they said.

    --
    Account abandoned.
  • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday January 14 2017, @07:40PM (#453883) Journal

    Sensible solutions are NOT ALLOWED when child porn enters the picture.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Saturday January 14 2017, @08:31PM (#453904)

      Is that a sensible solution, though? Looking at pictures of kiddie porn tends to reinforce the urges. Plus, there's a huge conflict of interest pushing toward leniency: pictures that could go either way would likely be evaluated as OK, for reasons related to not wanting to screw things up for other folks.

      • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday January 14 2017, @10:25PM (#453931) Journal

        Solutions won't be evaluated or implemented. Kind of like the problem of studying cannabis [soylentnews.org], but much, much worse.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Anal Pumpernickel (776) on Saturday January 14 2017, @11:59PM (#453959)

        Or just drop the laws against mere possession and go after the actual producers.

        Looking at pictures of kiddie porn tends to reinforce the urges.

        Or maybe it stimulates urges that already exist in these people. You said "reinforce", but I'm not quite sure what that means exactly in this context.

        • (Score: 2) by choose another one (515) Subscriber Badge on Sunday January 15 2017, @09:14AM (#454044)

          > Or just drop the laws against mere possession and go after the actual producers.

          I thought the majority of CP these days was distributed by law enforcement (FBI mostly?) to trap the consumers -- you are saying law enforcement should go after themselves?

          I suppose the PTSD sufferers could go after law enforcement as well as MS, but MS is probably a larger and easier target.

  • (Score: 2) by bradley13 (3053) on Saturday January 14 2017, @07:56PM (#453886) Homepage Journal

    "Get an AI"

    That really is the way it should go, and probably is going. Of course, the AI will make mistakes, but any sort of decent accuracy will reduce the load. People will only need to review the pics if someone complains that the AI has made a mistake.
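
    Something like this, sketched in Python -- a minimal illustration only; every name and threshold here is invented, and no claim is made that any real moderation pipeline works this way:

        from dataclasses import dataclass

        @dataclass
        class Flagged:
            item_id: str
            score: float  # classifier's estimated probability the item is illegal

        # Hypothetical thresholds; a real system would tune these carefully.
        AUTO_ACTION = 0.95  # at or above this, remove automatically
        AUTO_CLEAR = 0.05   # at or below this, clear automatically

        def triage(items, appealed_ids):
            """Return only the items a human still needs to look at."""
            human_queue = []
            for item in items:
                if item.item_id in appealed_ids:
                    human_queue.append(item)  # disputed auto-decisions get human eyes
                elif item.score >= AUTO_ACTION or item.score <= AUTO_CLEAR:
                    continue  # confident calls are handled automatically
                else:
                    human_queue.append(item)  # model is unsure: human review
            return human_queue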

    I also wonder: just how many illegal pics do these people have to review? Not counting false alarms - how many actually illegal ones? Are we talking 10 a day? 100? 1000? Anyone have a solid idea?

    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday January 14 2017, @08:00PM (#453888) Journal

      Machine learning AI could make things worse by removing the innocent pics from the flagged items pool, and leaving only naked, bloody, or petrified human bodies for the censors to look at.
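
      A toy simulation of that effect -- all numbers are invented; the point is only how the ratio shifts once a model skims off the easy negatives:

          import random
          random.seed(1)

          # Pretend 10,000 items get flagged and only 5% are genuinely awful.
          flagged = [random.random() < 0.05 for _ in range(10_000)]  # True = awful

          def score(is_awful):
              # An imperfect classifier: awful items usually score high, benign low.
              return random.betavariate(8, 2) if is_awful else random.betavariate(2, 8)

          scores = [score(x) for x in flagged]

          # Without ML, humans wade through the raw flagged pool (mostly benign).
          print(f"awful share, raw pool:       {sum(flagged) / len(flagged):.0%}")

          # With ML pre-filtering, humans only see what clears the 0.5 cutoff --
          # a much smaller queue, but nearly all of it is the worst material.
          queue = [x for x, s in zip(flagged, scores) if s > 0.5]
          print(f"awful share, filtered queue: {sum(queue) / len(queue):.0%}")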

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Bot (3902) on Saturday January 14 2017, @08:30PM (#453901) Journal

        BTW Facebook recently started tagging all photos, probably for blind (aka visually impaired, aka differently light perceiving, or whatever it is now) people. Must be AI-driven; look at the HTML source of the IMG tags on the photos.
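
        If you want to dump those machine-written tags without squinting at view-source, a quick sketch (stdlib only; the URL is a placeholder, since Facebook itself sits behind a login -- point it at any page you have saved or can reach):

            from html.parser import HTMLParser
            import urllib.request

            class AltDumper(HTMLParser):
                def handle_starttag(self, tag, attrs):
                    if tag == "img":
                        alt = dict(attrs).get("alt")
                        if alt:
                            # Facebook's generated ones read like
                            # "Image may contain: 2 people, smiling"
                            print(alt)

            # Placeholder URL -- substitute the page you want to inspect.
            page = urllib.request.urlopen("https://example.com/photos.html").read()
            AltDumper().feed(page.decode("utf-8", errors="replace"))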

        --
        Account abandoned.
      • (Score: 2) by Bot (3902) on Monday January 16 2017, @12:53AM (#454212) Journal

        You mean naked, petrified, and with hot grits, don't you?

        --
        Account abandoned.
    • (Score: 1) by Ethanol-fueled (2792) on Saturday January 14 2017, @09:04PM (#453916) Homepage

      A reputable former coworker told me about a time he found physical CP (I don't recall if it was printed or photos) and reported it. A postal inspector showed up, looked at the images, and said they have a quick and easy test for distinguishing CP from innocent pictures: turn the picture upside down, and if the focus of the picture is still sexual in nature, they consider it CP (of course it has to go to trial and all that, but it's their quick "field test").

      Turns out the guy it belonged to had been convicted of multiple sexual offenses in the past, but was hired by the company before they did background checks. Incidentally, there was another sex offender in the company who got in in a similar manner, but they kept him on because it was a he-said/she-said kind of thing (albeit with a 17-year-old subordinate, and the act was attempted but not carried out) -- you can look him up on the Megan's Law website and everything -- he's even smiling in his Megan's Law photo.

      • (Score: 0) by Anonymous Coward on Saturday January 14 2017, @10:17PM (#453929)

        So you're the perverted fuck who watches family movies upside down while masturbating furiously.

  • (Score: 2) by Azuma Hazuki (5086) on Sunday January 15 2017, @12:07AM (#453962) Journal

    Yeah, but since you're a bot, seeing what we meatbags do to other meatbags won't traumatize you. I suggest you stay off /r/techsupportgore though...

    --
    I am "that girl" your mother warned you about...