
SoylentNews is people

posted by janrinok on Tuesday November 28 2017, @08:47PM   Printer-friendly
from the if-you're-happy-and-you-know-it... dept.

Facebook is expanding its limited test run of suicide- and self-harm-reporting tools to the masses. To get better at detection, the social network will begin applying pattern recognition to posts and Live videos to detect when someone may be expressing suicidal thoughts. From there, VP of product management Guy Rosen writes that the social network will also concentrate efforts on improving how it alerts first responders when the need arises. Facebook will also have more humans looking at posts flagged by its algorithms.

Currently the passive/AI detection tools are only available in the US, but soon they will roll out across the globe, with the exception of European Union countries. In the past month, Facebook has pinged over 100 first responders about potentially fatal posts, in addition to those that were reported by someone's friends and family.

Apparently, "Are you okay?" and "Can I help?" comments are good indicators that someone might be going through a very dark moment. More than that, Rosen says that thanks to the algorithms and those phrases, Facebook has picked up on videos that might otherwise have gone unnoticed.
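As an illustration of the comment-based signal described above, here is a deliberately naive sketch. This is not Facebook's actual system: the phrase list, function names, and threshold are all invented for the example, and a production system would use a trained classifier rather than substring matching.

```python
# Toy sketch of a comment-phrase heuristic. The phrases and the
# threshold below are invented for illustration only.
CONCERN_PHRASES = ("are you okay", "can i help")

def concern_score(comments):
    """Count comments that contain any concern phrase (case-insensitive)."""
    score = 0
    for comment in comments:
        text = comment.lower()
        if any(phrase in text for phrase in CONCERN_PHRASES):
            score += 1
    return score

def should_flag_for_review(comments, threshold=3):
    """Route a post to human review once enough concerned replies appear."""
    return concern_score(comments) >= threshold
```

Note that even this toy version exhibits the weakness discussed in the comments below the article: the signal is driven entirely by what other people post, so a coordinated flood of "Are you okay?" replies would trip it regardless of the original post's content.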

"With all the fear about how AI may be harmful in the future, it's good to remind ourselves how AI is actually helping save people's lives today," CEO Mark Zuckerberg wrote in a post on the social network.

Source: https://www.engadget.com/2017/11/27/facebook-ai-suicide-prevention-tools/


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Insightful) by Anonymous Coward on Tuesday November 28 2017, @10:52PM (3 children)

    by Anonymous Coward on Tuesday November 28 2017, @10:52PM (#602716)

    No need to change the subject to see how this is dangerous. Behold:

    These are triggered I guess by a lot of people posting comments like "Are you okay?" and "Can I help?" to someone's content. And they'll soon be tied into the first responder network, essentially calling 911 on/for people.

    It doesn't take a brilliant mind to realize how to abuse this. Anyone you (or perhaps 4chan) don't like suddenly gets an unexpected flood of comments along those lines, and suddenly the police/EMS are at their door. They get justifiably upset, then get taken into custody or medically committed because they 'are a risk to their own safety.'

    Scary stuff. Glad I deleted my FB account long ago.

  • (Score: 5, Insightful) by Grishnakh on Tuesday November 28 2017, @11:03PM

    by Grishnakh (2831) on Tuesday November 28 2017, @11:03PM (#602729)

    Scary stuff. Glad I deleted my FB account long ago.

    That's no problem. Since you don't have an account there, someone who wants to get you committed just needs to create an account for you on FB, then make disturbing posts and have their confederates write responses saying "Are you OK?"

  • (Score: 5, Informative) by frojack on Tuesday November 28 2017, @11:06PM

    by frojack (1554) on Tuesday November 28 2017, @11:06PM (#602733) Journal

    Don't forget that mentally ill people are far more likely to be shot by police.

    https://www.usatoday.com/story/news/2015/12/10/people-mental-illness-16-times-more-likely-killed-police/77059710/ [usatoday.com]

    --
    No, you are mistaken. I've always had this sig.
  • (Score: 3, Touché) by Anonymous Coward on Tuesday November 28 2017, @11:07PM

    by Anonymous Coward on Tuesday November 28 2017, @11:07PM (#602734)

    "You don't seem to have a Facebook account. Are you okay?"