Facebook is expanding its limited test run of suicide- and self-harm-reporting tools to the masses. To get better at detection, the social network will begin implementing pattern recognition on posts and Live videos to detect when someone could be expressing suicidal thoughts. From there, VP of product management Guy Rosen writes that the social network will also concentrate on improving how it alerts first responders when the need arises. Facebook will also have more humans looking at posts flagged by its algorithms.
Currently the passive/AI detection tools are only available in the US, but they will soon roll out across the globe, with the exception of European Union countries. In the past month, Facebook has pinged over 100 first responders about potentially fatal posts, in addition to those reported by someone's friends and family.
Apparently, "Are you okay?" and "Can I help?" comments are good indicators that someone might be going through a very dark moment. More than that, Rosen says that thanks to the algorithms and those phrases, Facebook has picked up on videos that might otherwise have gone unnoticed.
"With all the fear about how AI may be harmful in the future, it's good to remind ourselves how AI is actually helping save people's lives today," CEO Mark Zuckerberg wrote in a post on the social network.
Source: https://www.engadget.com/2017/11/27/facebook-ai-suicide-prevention-tools/
(Score: 2) by All Your Lawn Are Belong To Us on Tuesday November 28 2017, @10:31PM (6 children)
Are you OK? Can I help?
This sig for rent.
(Score: 4, Informative) by edIII on Tuesday November 28 2017, @10:37PM (3 children)
Yeah, but according to TFA, signs of empathy like that are also signs of suicidal thoughts and intentions.
Are you going through a dark moment right now? Are you ok?
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 3, Funny) by MostCynical on Tuesday November 28 2017, @10:43PM (1 child)
You just asked the question! Where do we send the help*?
* note: help may or may not include, and is not limited to: medical professionals, SWAT team, auto-kill drones, medication, enforced time-out. All the listed or un-listed options will be provided at your expense, whether or not you use the services.
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 2) by tangomargarine on Wednesday November 29 2017, @03:31PM
We're really in trouble when the auto-medication drones show up. Remember that Doctor Who episode where Amy is hiding from the robots that are determinedly trying to give her a shot that's lethal to humans?
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 1, Insightful) by Anonymous Coward on Wednesday November 29 2017, @02:11PM
I see, the world is already so fucked up that you need to be a sociopath lacking any trace of empathy in order not to have suicidal thoughts.
(Score: 2) by Hyper on Wednesday November 29 2017, @12:41AM (1 child)
Great. Thanks. Next up: Facebook scans other social media sites too.
"Are you OK?!?" -sig of the year 2017
(Score: 0) by Anonymous Coward on Friday December 01 2017, @02:55AM
Are you going through a dark moment right now? Are you ok?