
posted by janrinok on Sunday April 15 2018, @01:13PM
from the can-it-be-cured-by-medical-AI? dept.

Could artificial intelligence get depressed and have hallucinations?

As artificial intelligence (AI) allows machines to become more like humans, will they experience similar psychological quirks such as hallucinations or depression? And might this be a good thing?

Last month, New York University in New York City hosted a symposium called Canonical Computations in Brains and Machines, where neuroscientists and AI experts discussed overlaps in the way humans and machines think. Zachary Mainen, a neuroscientist at the Champalimaud Centre for the Unknown, a neuroscience and cancer research institute in Lisbon, speculated [36m video] that we might expect an intelligent machine to suffer some of the same mental problems people do.

[...] Q: Why do you think AIs might get depressed and hallucinate?

A: I'm drawing on the field of computational psychiatry, which assumes we can learn about a patient who's depressed or hallucinating from studying AI algorithms like reinforcement learning. If you reverse the arrow, why wouldn't an AI be subject to the sort of things that go wrong with patients?

Q: Might the mechanism be the same as it is in humans?

A: Depression and hallucinations appear to depend on a chemical in the brain called serotonin. It may be that serotonin is just a biological quirk. But if serotonin is helping solve a more general problem for intelligent systems, then machines might implement a similar function, and if serotonin goes wrong in humans, the equivalent in a machine could also go wrong.
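Purely as an illustration of the "reverse the arrow" idea above, and not something from the interview: here is a toy Python sketch of a tabular TD(0)-style value learner in which a hypothetical "serotonin-like" gain scales negative reward-prediction errors. The function run_td_agent and the neg_error_gain parameter are made up for this sketch; the point is only that miscalibrating one such parameter leaves the agent with persistently pessimistic value estimates, a crude analogue of the failure mode Mainen speculates about.

import random

def run_td_agent(neg_error_gain, episodes=2000, alpha=0.1, seed=0):
    """Estimate the value of a state whose reward is +1 or -1 with equal
    probability (true value: 0). neg_error_gain scales only the negative
    reward-prediction errors."""
    rng = random.Random(seed)
    value = 0.0
    for _ in range(episodes):
        reward = rng.choice([+1.0, -1.0])
        delta = reward - value              # reward-prediction error
        if delta < 0:
            delta *= neg_error_gain         # hypothetical "serotonin-like" gain
        value += alpha * delta              # TD(0)-style update
    return value

if __name__ == "__main__":
    print("balanced gain    :", round(run_td_agent(neg_error_gain=1.0), 2))
    print("overweighted gain:", round(run_td_agent(neg_error_gain=3.0), 2))
    # The overweighted estimate settles well below the true value of 0,
    # i.e. the agent ends up systematically "pessimistic".

In this toy setup, applying a gain g to negative errors shifts the learner's fixed point to (1 - g) / (1 + g), so g = 3 drives the estimate toward -0.5 even though the true expected reward is 0.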

Related: Do Androids Dream of Electric Sheep?


Original Submission

 
  • (Score: 1, Interesting) by Anonymous Coward on Sunday April 15 2018, @05:00PM (4 children)

    by Anonymous Coward on Sunday April 15 2018, @05:00PM (#667311)

    No, seriously, your response is the basic boilerplate answer for normies who think their x86 chip is like a human brain. But it is not a very rigorous answer and does not take into account new architectures. There's even talk of recurrent neural networks exhibiting real intelligence. If that is anywhere near true, they could probably exhibit something like depression as well.

    Your human brain is a machine. Its functionality can likely be copied using nonbiological components. If it happens, it could be kept a secret for years to maintain a serious competitive or military advantage.

  • (Score: 1, Insightful) by Anonymous Coward on Sunday April 15 2018, @05:48PM (3 children)

    by Anonymous Coward on Sunday April 15 2018, @05:48PM (#667327)

    And YOUR response sounds like a boilerplate answer for someone who is trying to sell this hocus-pocus AI crap. Do people really need machines that can feel a genuine sense of satisfaction whenever they do their jobs well? Or get depressed when they don't? Seriously, what does your precious "AI" do that humanity actually NEEDS? All I see it ever used for is a programming shortcut. Have some complex problem? Just throw "AI" at it! Just beat it like a dog until it does what you want 99.99% of the time, but no one has any way to know what it has really "learned" or what it will do in unexpected outlier cases. Oh, sure, pedantically one could pull apart and audit every bit, but no one does that. And if they did, it would no longer be "artificial" intelligence.

    There is no substitute for properly engineered, audited program code.

    • (Score: 0) by Anonymous Coward on Sunday April 15 2018, @06:01PM (1 child)

      by Anonymous Coward on Sunday April 15 2018, @06:01PM (#667331)

      NEED? Try WANT. It could be anything from a genius scientist or a sleepless artist to pure slave labor.

      It doesn't need to substitute for your "properly engineered" (yeah right), badly documented program code. It can work alongside it or write code itself.

      • (Score: 0) by Anonymous Coward on Sunday April 15 2018, @08:10PM

        by Anonymous Coward on Sunday April 15 2018, @08:10PM (#667372)

        "It could be anything"

        Spoken like a true brainwashed marketoid. Any product can always do ANYTHING!

    • (Score: 2) by acid andy on Sunday April 15 2018, @06:10PM

      by acid andy (1683) on Sunday April 15 2018, @06:10PM (#667339) Homepage Journal

      Seriously, what does your precious "AI" do that humanity actually NEEDS?

      Eventually, hopefully, it thinks much faster than humans and maybe even with greater intelligence.

      --
      If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?