Facebook is expanding its limited test of suicide- and self-harm-reporting tools to everyone. To get better at detection, the social network will begin applying pattern recognition to posts and Live videos to spot when someone may be expressing suicidal thoughts. From there, VP of product management Guy Rosen writes that the company will also concentrate on improving how it alerts first responders when the need arises. Facebook will also have more humans reviewing posts flagged by its algorithms.
Currently the passive/AI detection tools are only available in the US, but they will soon roll out across the globe, with the exception of European Union countries. In the past month, Facebook has contacted over 100 first responders about potentially fatal posts, in addition to those reported by someone's friends and family.
Apparently, comments like "Are you okay?" and "Can I help?" are good indicators that someone might be going through a very dark moment. More than that, Rosen says that thanks to the algorithms and those phrases, Facebook has picked up on videos that might otherwise have gone unnoticed.
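Facebook hasn't published how its classifier works, but the idea of treating concerned comments as a signal can be sketched as a toy phrase-matching score. The phrase list, threshold, and function names below are purely illustrative assumptions, not Facebook's actual system, which is a trained model rather than a keyword match:

```python
# Toy sketch: flag a post for human review when enough comments
# contain known "concern" phrases. All names and values here are
# illustrative, not Facebook's real implementation.

CONCERN_PHRASES = ["are you okay", "can i help"]

def concern_score(comments):
    """Count comments that contain at least one concern phrase."""
    score = 0
    for comment in comments:
        text = comment.lower()
        if any(phrase in text for phrase in CONCERN_PHRASES):
            score += 1
    return score

def should_flag(comments, threshold=2):
    """Escalate to human reviewers once enough concern appears."""
    return concern_score(comments) >= threshold

comments = ["Are you okay?", "Nice pic!", "Can I help?"]
print(should_flag(comments))  # True: two concern comments meet the threshold
```

A real system would use a trained model over many features (post text, video, reaction patterns), with the phrase signal as just one input.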
"With all the fear about how AI may be harmful in the future, it's good to remind ourselves how AI is actually helping save people's lives today," CEO Mark Zuckerberg wrote in a post on the social network.
Source: https://www.engadget.com/2017/11/27/facebook-ai-suicide-prevention-tools/
(Score: 3, Funny) by Anonymous Coward on Tuesday November 28 2017, @09:20PM (9 children)
*check if user is in danger of suicide*
*user uses Facebook regularly, user is likely in danger of suicide*
*sounds alarm*
(Score: 3, Funny) by krishnoid on Tuesday November 28 2017, @09:28PM (4 children)
*also temporarily shuts user out of Facebook for two days*
(Score: 3, Insightful) by frojack on Tuesday November 28 2017, @10:59PM (3 children)
Or just fails to log into Facebook for two days!!!
That's what Zuck fears most.
No, you are mistaken. I've always had this sig.
(Score: 2) by Phoenix666 on Wednesday November 29 2017, @09:41AM (2 children)
Exactly. If you haven't posted on Facebook or even logged in for a very long time, you clearly have no contact with friends and family and must be depressed. Wait until they get a law passed that says they can commit you to a facility for monitoring on suicide watch if you don't regularly post on Facebook.
Washington DC delenda est.
(Score: 0) by Anonymous Coward on Wednesday November 29 2017, @02:04PM (1 child)
Far too expensive. Just mandate cameras installed all over their homes, for early detection of any preparations for suicide. Of course the cameras are all connected to Facebook, so that its suicide risk detection AI can do its job better. Oh, and they had better not do anything to disable those cameras or restrict their view … for their own good, of course.
(Score: 0) by Anonymous Coward on Thursday November 30 2017, @04:31AM
Please, can I just send in my nudes instead?
(Score: 2, Insightful) by Anonymous Coward on Tuesday November 28 2017, @09:30PM (3 children)
Zuckerberg is just trying to maintain his user base. Every user who offs himself or herself causes a loss of revenue.
(Score: 2) by DannyB on Tuesday November 28 2017, @09:36PM (2 children)
Then wouldn't it be better for Facebook to develop a way to effectively turn the entire user base into distracted, robot-like sloths?
People today are educated enough to repeat what they are taught but not to question what they are taught.
(Score: 2) by frojack on Tuesday November 28 2017, @11:01PM
After years of getting them to be posting dervishes, you expect Zuck to now switch course?
No, you are mistaken. I've always had this sig.
(Score: 0) by Anonymous Coward on Wednesday November 29 2017, @07:35AM
What do you mean "develop"?