
posted by mrpg on Tuesday November 20 2018, @06:25AM
from the is-this-good-or-bad? dept.

Submitted via IRC for takyon

Facebook Increasingly Reliant on A.I. To Predict Suicide Risk

A year ago, Facebook started using artificial intelligence to scan people's accounts for danger signs of imminent self-harm.

[...] "To just give you a sense of how well the technology is working and rapidly improving ... in the last year we've had 3,500 reports," she says. That means AI monitoring is causing Facebook to contact emergency responders an average of about 10 times a day to check on someone — and that doesn't include Europe, where the system hasn't been deployed. (That number also doesn't include wellness checks that originate from people who report suspected suicidal behavior online.)

Davis says the AI works by monitoring not just what a person writes online, but also how his or her friends respond. For instance, if someone starts streaming a live video, the AI might pick up on the tone of people's replies.

[...] "Ever since they've introduced livestreaming on their platform, they've had a real problem with people livestreaming suicides," Marks says. "Facebook has a real interest in stopping that."

He isn't sure this AI system is the right solution, in part because Facebook has refused to share key data, such as the AI's accuracy rate. How many of those 3,500 "wellness checks" turned out to be actual emergencies? The company isn't saying.


Original Submission

 
  • (Score: 2) by darkfeline (1030) on Tuesday November 20 2018, @08:36PM (#764393) Homepage (1 child)

    We the people are the ones who demanded that Facebook meddle. Not you or me personally, but the general public. Left to its own devices, Facebook has no motivation to predict suicide risk, but people seem to think Facebook is responsible for fixing any and all social and political problems that touch its platform in any way.

    --
    Join the SDF Public Access UNIX System today!
  • (Score: 2) by Runaway1956 (2926) Subscriber Badge on Wednesday November 21 2018, @01:39AM (#764512) Journal

    I suspect you're right. From my perspective, it's easy to just blame Zuck for trying to be your whole life. But, yeah, people probably did motivate him to "do something". Should we call it coercion?