
posted by mrpg on Tuesday November 20 2018, @06:25AM
from the is-this-good-or-bad? dept.

Submitted via IRC for takyon

Facebook Increasingly Reliant on A.I. To Predict Suicide Risk

A year ago, Facebook started using artificial intelligence to scan people's accounts for danger signs of imminent self-harm.

[...] "To just give you a sense of how well the technology is working and rapidly improving ... in the last year we've had 3,500 reports," she says. That means AI monitoring is causing Facebook to contact emergency responders an average of about 10 times a day to check on someone — and that doesn't include Europe, where the system hasn't been deployed. (That number also doesn't include wellness checks that originate from people who report suspected suicidal behavior online.)

Davis says the AI works by monitoring not just what a person writes online, but also how his or her friends respond. For instance, if someone starts streaming a live video, the AI might pick up on the tone of people's replies.

[...] "Ever since they've introduced livestreaming on their platform, they've had a real problem with people livestreaming suicides," Marks says. "Facebook has a real interest in stopping that."

He isn't sure this AI system is the right solution, in part because Facebook has refused to share key data, such as the AI's accuracy rate. How many of those 3,500 "wellness checks" turned out to be actual emergencies? The company isn't saying.
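
A rough sense of the mechanics Davis describes: the system scores a post by combining signals from the post's own text with signals from the tone of the replies it attracts. Facebook has not published its model, so the Python sketch below is purely illustrative; every phrase list, weight, and threshold is invented for this example, and a real system would use learned classifiers rather than keyword matching.

    # Purely illustrative sketch; Facebook has not published its model.
    # All names, phrase lists, weights, and thresholds are hypothetical.
    from dataclasses import dataclass
    from typing import List

    # Toy keyword lists standing in for learned text classifiers.
    RISK_PHRASES = {"want to die", "end it all", "goodbye everyone"}
    CONCERNED_REPLY_PHRASES = {"are you ok", "please call someone", "thinking of you"}

    @dataclass
    class Post:
        text: str
        replies: List[str]

    def risk_score(post: Post) -> float:
        """Combine a signal from the post itself with the tone of its replies."""
        post_signal = any(p in post.text.lower() for p in RISK_PHRASES)
        concerned = sum(
            any(p in reply.lower() for p in CONCERNED_REPLY_PHRASES)
            for reply in post.replies
        )
        reply_signal = concerned / max(len(post.replies), 1)
        # The weights are arbitrary; a production system would learn them.
        return 0.7 * float(post_signal) + 0.3 * reply_signal

    if __name__ == "__main__":
        example = Post(
            text="I just want to end it all.",
            replies=["Are you OK? Please call someone.", "Thinking of you."],
        )
        score = risk_score(example)
        # A score above some threshold would flag the post for human review,
        # not trigger an automatic call to emergency responders.
        print(f"risk score: {score:.2f}")

In a sketch like this, a post scoring above the threshold would go to human reviewers, who decide whether a wellness check is warranted; how closely that matches Facebook's actual pipeline is unknown.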


Original Submission

 
  • (Score: 3, Interesting) by tangomargarine on Tuesday November 20 2018, @08:27PM (1 child)

    by tangomargarine (667) on Tuesday November 20 2018, @08:27PM (#764386)

    I notice you didn't address the latter part of my argument. I was being somewhat rhetorical with the "who decides" part to make a point. Deciding that somebody no longer deserves to exercise their inalienable rights is not a thing to be taken lightly, no matter how many degrees and certifications you have. This case in particular is the worst, because you're taking away the person's ability to opt out of the judgment.

    (Although come to think of it..."life, liberty, and the pursuit of happiness"...what if life and the pursuit of happiness are mutually exclusive at this point?)

    The evaluating expert may temporarily restrain someone if there is a genuine belief of threat to self [or] others, just as a civilian can, for the reasons above. But the professional still has to take the issue before a court and get a judge to certify that the person is not capable of making his or her own decisions. (Assuming the person doesn't voluntarily submit themselves, which can happen too.)

    Being a threat to oneself and being a threat to others are two very different things. To continue the abortion analogy, how would people feel about somebody rushing in and telling a woman she can't have her abortion because some yahoo with a fancy title has decided he knows better than she does and has gotten her declared crazy?

    Are psychiatrists ever going to find a suicidal person not crazy? Because if we're defining suicidal ideation as crazy to begin with, then this is a circular argument and you're axiomatically correct. I don't accept that. What if the person has reasoned arguments for suicide? "I've been massively irradiated, and it's going to take the next month for me to slowly, painfully fall apart and drown in my own fluids, and there's nothing anybody can do to fix that."

    Thanks for the links.

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 2) by All Your Lawn Are Belong To Us on Wednesday November 21 2018, @12:14AM

    by All Your Lawn Are Belong To Us (6553) on Wednesday November 21 2018, @12:14AM (#764472) Journal

    You're welcome. And yes, I did skip over it a little bit. You are correct that removing the right to liberty, expressed in bioethics as the denial of patient autonomy, is not something to be taken lightly at all. And yes, the default model is that wanting to kill oneself is an irrational act unless proven otherwise. But it is equally true that removing autonomy opens a provider up to great liability risk, which is one reason the decision of a court is sought at the earliest opportunity: at least the risk is then spread among professional entities that are nominally at arm's length from each other. And courts have taken notice that a patient with decision-making capacity does have that freedom; medical personnel who disregard it have been found liable for taking it away.

    I'm not sure that the difference you describe (self-harm versus harm to others) is always that great, but I'll acknowledge that I could be wrong.

    The situation you have defined in your argument, though, is not necessarily an emergency, either. And yes, psychiatrists do indeed find people in the circumstance you described not crazy. In locations where medically assisted end of life is legal (the term I prefer over "physician-assisted suicide," and I'm in favor of allowing it), the requirement is usually at least two separate psychological evaluations, conducted at least 30 days apart, which determine that the patient is lucid, has the capacity to understand the decision, and is not suffering from any treatable mental pathology that might change the patient's decision. (Not just depression: a person seeking assistance may be depressed by their condition but still possess capacity.) In that case, however, there is an understandable driver behind the desire. The opposite exists as well, where there is either no clearly definable driver or the expressed reasons do not comport with objective reality. At any rate, there are indeed times when mental health professionals will look at someone's request to end his or her life and recognize a rational reason to choose that path.

    --
    This sig for rent.