
posted by janrinok on Sunday April 15 2018, @01:13PM   Printer-friendly
from the can-it-be-cured-by-medical-AI? dept.

Could artificial intelligence get depressed and have hallucinations?

As artificial intelligence (AI) allows machines to become more like humans, will they experience similar psychological quirks such as hallucinations or depression? And might this be a good thing?

Last month, New York University in New York City hosted a symposium called Canonical Computations in Brains and Machines, where neuroscientists and AI experts discussed overlaps in the way humans and machines think. Zachary Mainen, a neuroscientist at the Champalimaud Centre for the Unknown, a neuroscience and cancer research institute in Lisbon, speculated [36m video] that we might expect an intelligent machine to suffer some of the same mental problems people do.

[...] Q: Why do you think AIs might get depressed and hallucinate?

A: I'm drawing on the field of computational psychiatry, which assumes we can learn about a patient who's depressed or hallucinating from studying AI algorithms like reinforcement learning. If you reverse the arrow, why wouldn't an AI be subject to the sort of things that go wrong with patients?
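
To make that concrete, here is a minimal sketch (mine, not the article's) of one kind of model computational psychiatry fits to behavioral data: a reinforcement learner whose "anhedonia" is a blunted reward-sensitivity parameter. All names and constants are illustrative assumptions.

    # Illustrative sketch: blunted reward sensitivity in a two-armed bandit.
    import math
    import random

    def softmax_choice(q, beta=5.0):
        """Pick an action with probability proportional to exp(beta * value)."""
        weights = [math.exp(beta * v) for v in q]
        r = random.random() * sum(weights)
        for action, w in enumerate(weights):
            r -= w
            if r <= 0:
                return action
        return len(q) - 1

    def run_bandit(reward_sensitivity, steps=2000, alpha=0.1):
        """Arm 1 pays off 70% of the time, arm 0 only 30%."""
        q = [0.0, 0.0]            # learned action values
        pulls = [0, 0]
        for _ in range(steps):
            a = softmax_choice(q)
            raw = 1.0 if random.random() < (0.3, 0.7)[a] else 0.0
            r = reward_sensitivity * raw   # a "depressed" agent feels less reward
            q[a] += alpha * (r - q[a])     # standard temporal-difference update
            pulls[a] += 1
        return pulls

    print(run_bandit(1.0))   # healthy: strongly prefers the better arm
    print(run_bandit(0.1))   # blunted: value gap shrinks, choices go near-random

Fitting a parameter like reward_sensitivity to a patient's choices is the forward direction; the point of "reversing the arrow" is that the same parameter, mis-set in a machine, would produce the same apathetic behavior.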

Q: Might the mechanism be the same as it is in humans?

A: Depression and hallucinations appear to depend on a chemical in the brain called serotonin. It may be that serotonin is just a biological quirk. But if serotonin is helping solve a more general problem for intelligent systems, then machines might implement a similar function, and if serotonin goes wrong in humans, the equivalent in a machine could also go wrong.
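
One hedged way to turn that speculation into code (the mapping is my assumption, not Mainen's model): treat a serotonin-like signal as a global neuromodulator that raises the learning rate when the world is surprising. Clamping the signal, in either direction, makes the same machinery misbehave.

    # Illustrative sketch: a surprise-driven "neuromodulator" sets the learning
    # rate; "stuck_low"/"stuck_high" stand in for the signal going wrong.
    import random

    def track(modulator, steps=300):
        """Track a noisy quantity whose true mean jumps halfway through."""
        estimate = 0.0
        errors = []
        for t in range(steps):
            true_mean = 0.0 if t < steps // 2 else 5.0
            obs = true_mean + random.gauss(0.0, 0.5)
            surprise = abs(obs - estimate)
            if modulator == "adaptive":     # healthy: surprise boosts plasticity
                gain = min(1.0, 0.05 + 0.1 * surprise)
            elif modulator == "stuck_low":  # rigid: clings to stale beliefs
                gain = 0.005
            else:                           # "stuck_high": thrashes on noise
                gain = 0.9
            estimate += gain * (obs - estimate)
            errors.append(abs(true_mean - estimate))
        return sum(errors) / steps

    for mode in ("adaptive", "stuck_low", "stuck_high"):
        print(mode, round(track(mode), 3))  # both clamped modes track worse

Whether serotonin really computes anything like this is exactly the open question; the sketch only shows that a shared "neuromodulator" function gives a machine a shared failure mode.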

Related: Do Androids Dream of Electric Sheep?


Original Submission

 
  • (Score: 4, Insightful) by SomeGuy on Sunday April 15 2018, @03:51PM (8 children)

    by SomeGuy (5632) on Sunday April 15 2018, @03:51PM (#667286)

    Exactly, machines are still machines. Any human qualities they appear to have are simply programmed. They don't think like humans and will never have the same emotions as humans simply because they are not human.

    Machines don't get happy, they don't get sad, they don't get angry, they don't laugh at your lame jokes; THEY JUST RUN PROGRAMS!

    While it is fun for nerds to think about, this kind of speculative "news" is the sort of stuff that puts unrealistic expectations in average people's heads about what machines can do.

  • (Score: 0, Touché) by Anonymous Coward on Sunday April 15 2018, @04:26PM (1 child)

    by Anonymous Coward on Sunday April 15 2018, @04:26PM (#667299)

    Gee, what an obtuse response. It's like you've never heard of neuromorphic architectures or brain emulation.

    What are you again? A biological machine with a meatbag superiority complex. Consciousness will be replicated in a decade or two, assuming the military hasn't already done it.

    • (Score: 3, Insightful) by SomeGuy on Monday April 16 2018, @03:32PM

      by SomeGuy (5632) on Monday April 16 2018, @03:32PM (#667660)

      There seems to be a strange amount of disagreement here. First of all, that line was from the comedy movie "Short Circuit". It was supposed to be funny. (Perhaps you are an AI and can't laugh at my lame joke? :) )

      There is the very real problem that human emotions are the result of complex bio-chemical reactions. Can they be emulated on a silicon chip? Given enough processing and electrical power, sure. But it is still emulation. Ask any MAME user: even emulating silicon on silicon loses something. And then there is the bigger problem: what practical purpose does that serve? There may be a few narrow niche answers, such as better understanding the human condition, but I hope no one would want to ride in a self-driving car that genuinely, consciously hates them.

      Then there is the more general problem with "AI": what it "learns" is potentially garbage that just happens to do what is expected. There is an interesting Isaac Asimov short story entitled "Reason" that illustrates this quite well. In the story, robots are in charge of a power source that could destroy the entire Earth. The robots malfunction and develop a religion around what they do, yet in the end they appear to perform their job perfectly, even though they do everything only because of their religion.

      Even if it works, should you trust it? How does it behave when the unexpected happens? What about edge cases that weren't explicitly tested for? Can you be sure it will behave consistently in all circumstances?

      All of that still doesn't change the fact that the computers in use today revolve around the classic von Neumann architecture. If there are any "neuromorphic architecture" computers, or the like, in production anywhere, please post factual details; I very well may have missed the memo. But it is pointless to speculate about what people like the military may be using in secret. They MAY be using alien technology too; you can't prove otherwise.

      Which actually brings me to another issue. Emotions are very human-specific. They have evolved over billions of years and are shared by some, but not all, animals on this planet. An alien species would likely have a completely different set of "emotions". They may not be able to laugh or cry, but could still be intelligent and even sentient. The point is, emotions are not necessarily needed, and the desire to place such emotions on computers is simply anthropomorphizing.

      Specifically, depression exists in animals as an indirect way to eliminate underperforming members. Members that cannot meet their goals, such as collecting food or reproducing, may become "depressed": lethargic and slow, making them easier for predators to eliminate, or driven toward riskier activities such as more dangerous paths to collect food. The trait is a group trait, and is therefore passed on by the surviving group that benefits from the removal of the individual. What would be the logic of emulating this in program code? I would think there would be much more efficient direct algorithms.

      Anyway, "AI" has been a marketing buzzword for a very, very long time now. Yet, like flying cars, it has yet to deliver anything meaningful. It always takes time for younger generations to become jaded about such buzzwords, so this idea will continue to get thrown around. Of course, you could just sell an empty box with the letters "AI" printed on the front and a bunch of flashing blue LEDs, and most idiots would happily buy it.

  • (Score: 3, Funny) by cellocgw on Sunday April 15 2018, @04:42PM

    by cellocgw (4190) on Sunday April 15 2018, @04:42PM (#667306)

    Machines don't get happy, they don't get sad, they don't get angry, they don't laugh at your lame jokes; THEY JUST RUN PROGRAMS!

    Nice, you just hurt my computer's feelings bigly.
    It is sulking and won't play with my tablet any more.

    --
    Physicist, cellist, former OTTer (1190) resume: https://app.box.com/witthoftresume
  • (Score: 1, Interesting) by Anonymous Coward on Sunday April 15 2018, @05:00PM (4 children)

    by Anonymous Coward on Sunday April 15 2018, @05:00PM (#667311)

    No, seriously, your response is the basic boilerplate answer for normies who think their x86 chip is like a human brain. But it is not a very rigorous answer and does not take into account new architectures. There's even talk of recurrent neural networks exhibiting real intelligence. If that is anywhere near true, they could probably exhibit something like depression as well.

    Your human brain is a machine. Its functionality can likely be copied using nonbiological components. If that happens, it could be kept secret for years to maintain a serious competitive or military advantage.

    • (Score: 1, Insightful) by Anonymous Coward on Sunday April 15 2018, @05:48PM (3 children)

      by Anonymous Coward on Sunday April 15 2018, @05:48PM (#667327)

      And YOUR response sounds like a boilerplate answer for someone who is trying to sell this hocus-pocus AI crap. Do people really need machines that can feel a genuine sense of satisfaction whenever they do their jobs well? Or get depressed when they don't? Seriously, what does your precious "AI" do that humanity actually NEEDS? All I see it ever used for is a programming shortcut. Have some complex problem? Just throw "AI" at it! Just beat it like a dog until it does what you want 99.99% of the time, but no one has any way to know what it has really "learned" or what it will do in unexpected outlier cases. Oh, sure, pedantically one could pull apart and audit every bit, but no one does that. And if they did, it would no longer be "artificial" intelligence.

      There is no substitute for properly engineered, audited program code.

      • (Score: 0) by Anonymous Coward on Sunday April 15 2018, @06:01PM (1 child)

        by Anonymous Coward on Sunday April 15 2018, @06:01PM (#667331)

        NEED? Try WANT. It could be anything from genius scientist to sleepless artist to pure slave labor.

        It doesn't need to substitute for your "properly engineered" (yeah right), badly documented program code. It can work alongside it or write code itself.

        • (Score: 0) by Anonymous Coward on Sunday April 15 2018, @08:10PM

          by Anonymous Coward on Sunday April 15 2018, @08:10PM (#667372)

          "It could be anything"

          Spoken like a true brainwashed marketoid. Any product can always do ANYTHING!

      • (Score: 2) by acid andy on Sunday April 15 2018, @06:10PM

        by acid andy (1683) on Sunday April 15 2018, @06:10PM (#667339) Homepage Journal

        Seriously, what does your precious "AI" do that humanity actually NEEDS?

        Eventually, hopefully, it will think much faster than humans, and maybe even with greater intelligence.

        --
        If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?