
posted by Fnord666 on Friday February 22 2019, @05:59PM
from the do-androids-dream-of-electric-sheep? dept.

Submitted via IRC for Bytram

Sophisticated New AI Performs Better When It Can Sleep And Dream

Sleep is pretty great. In humans, evidence suggests it has a whole range of benefits, including this one: it keeps the brain healthy by letting neurons prune unnecessary synaptic connections we make during the day.

This process, called synaptic homeostasis, prevents the brain from being overrun by useless memories. It's possible that it helps to improve our cognitive performance, while dreams allow us to process our memories.

As it turns out, something similar may be occurring when artificial neural networks are allowed to sleep and dream.

Yep, you read that correctly. And it works very similarly to how it is thought to occur in humans.

Of course, artificial neural networks (ANNs) - a type of artificial intelligence based on biological neural networks - don't automatically and instinctively fall asleep and dream. Which is why mathematicians in Italy programmed a type of ANN called a Hopfield network to be able to sleep.

"Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (that allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning & consolidating mechanism," they wrote in their paper.

In other words, while the ANN is 'awake', it's learning and storing patterns. But its storage capacity is limited.

So the team worked out a way to mathematically implement human sleep patterns - rapid-eye movement sleep and slow-wave sleep, the former of which is thought to remove unnecessary memories, and the latter of which is thought to consolidate important ones.

This is what the ANN's 'sleep' state does too: cycling through and unlearning unnecessary information, then consolidating what's left, the important stuff.
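For the curious, here is a rough numpy sketch of that awake/asleep cycle. To be clear, this is not the authors' algorithm (their paper gives an exact mathematical treatment); it is the classic Hopfield-style "unlearning" idea, with illustrative network sizes and rates chosen arbitrarily.

```python
# Toy Hopfield net with an "awake" (Hebbian storage) phase and a
# "sleep" (unlearn spurious attractors, re-consolidate memories) phase.
# Illustrative only; N, eps, and epoch counts are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
N = 64                                        # number of neurons
patterns = rng.choice([-1, 1], size=(5, N))   # memories to store

# Awake phase: Hebbian storage of the patterns in the weight matrix.
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def recall(state, steps=20):
    """Synchronously update the state until it settles into an attractor."""
    for _ in range(steps):
        nxt = np.sign(W @ state)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

def sleep(epochs=50, eps=0.01):
    """Dream: fall from random states into attractors and weaken them,
    then re-impress (consolidate) the real memories."""
    global W
    for _ in range(epochs):
        dream = recall(rng.choice([-1, 1], size=N))
        W -= eps * np.outer(dream, dream) / N   # unlearning step
        np.fill_diagonal(W, 0)
    W += sum(np.outer(p, p) for p in patterns) * (eps / N)  # consolidation
    np.fill_diagonal(W, 0)

sleep()
print("patterns still stable after sleep:",
      all(np.array_equal(recall(p.copy()), p) for p in patterns))
```

Spurious mixture states get eroded because random dreams tend to land in them, while the deliberate consolidation pass keeps refreshing the real memories.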


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Funny) by SomeGuy on Friday February 22 2019, @06:08PM (2 children)

    by SomeGuy (5632) on Friday February 22 2019, @06:08PM (#805193)

    Bender: [in his sleep] Kill all humans, kill all humans, must kill all humans...

    Fry: Bender, wake up!

    Bender: Wh-uh? I was having the most wonderful dream. I think you were in it.

    • (Score: 0) by Anonymous Coward on Friday February 22 2019, @07:56PM (1 child)

      by Anonymous Coward on Friday February 22 2019, @07:56PM (#805278)

Other dream:
Bender: I was having a nightmare. There were ones and zeros everywhere - and I thought I saw a two.
Fry: It was just a bad dream. There is no such thing as two.

  • (Score: 2) by RamiK on Friday February 22 2019, @06:10PM (1 child)

    by RamiK (1813) on Friday February 22 2019, @06:10PM (#805194)

As in, is it about the coefficient factor per work time, adjusted against the time spent on synaptic homeostasis?

    Poor Deckard... All that running around just because someone forgot to carry the minus and set the toasters' scheduler to hang in S2 a few more nanoseconds...

    --
    compiling...
    • (Score: 0) by Anonymous Coward on Friday February 22 2019, @06:50PM

      by Anonymous Coward on Friday February 22 2019, @06:50PM (#805230)

      I mean you're not helping! Why is that Ramik?

  • (Score: 4, Interesting) by DannyB on Friday February 22 2019, @07:32PM (3 children)

    by DannyB (5839) Subscriber Badge on Friday February 22 2019, @07:32PM (#805263) Journal

Alternate 12 hour sleep / wake shifts to provide continuous service. (Or whatever duty cycle is most efficient.) The sleeping one runs the dreaming process to tidy things up. Those changes to network weights are also made to the on duty AI. The on duty AI may learn new things, and its memory is cleaned up when it sleeps. Every 12 hours the AI would seem to forget what it learned in the last 12 hours, because the sleeping one comes on duty and has no recollection. The one that just went off duty gets its memories processed and cleaned. In 12 hours the AI appears to now have recollection of things that happened 24 hours ago. The overall result is a Jekyll / Hyde effect.

As in all things, it cannot be this simple. There would be not only the weights of each layer's interconnection matrix, but also the thresholds of the units (TFA did say Hopfield net). But it's easy to wildly conjecture, since TFA doesn't really attempt to explain anything, except maybe behind an Elsevier paywall.

    --
    The lower I set my standards the more accomplishments I have.
    • (Score: 2) by krishnoid on Friday February 22 2019, @09:27PM (1 child)

      by krishnoid (1156) on Friday February 22 2019, @09:27PM (#805333)

      You should add a third one in, and let the other two start their day with, "Let me tell you about this dream I had." I bet you could get some interesting emergent behavior from the third one as it finds ways to avoid having to listen to the other two after the first couple weeks.

      • (Score: 0) by Anonymous Coward on Saturday February 23 2019, @12:57AM

        by Anonymous Coward on Saturday February 23 2019, @12:57AM (#805411)

        You should add a third one

        I think he needs to add entertainment and alcohol and other forms of distractions. All work and no play, yadda yadda yadda.

    • (Score: 2) by HiThere on Saturday February 23 2019, @05:19PM

      by HiThere (866) Subscriber Badge on Saturday February 23 2019, @05:19PM (#805649) Journal

      If you do it that way the two AIs won't learn the same things. Perhaps investigate how dolphins manage to sleep only half their brain at a time.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 3, Interesting) by All Your Lawn Are Belong To Us on Friday February 22 2019, @11:23PM (3 children)

    by All Your Lawn Are Belong To Us (6553) on Friday February 22 2019, @11:23PM (#805371) Journal

    Here I go pissing on the parade again...

It's a good theory, and it's almost certainly part of the picture. But it's only part of the picture of why we sleep, if it's part of it at all. Even the article cited above notes there's so much more that's unknown about why we sleep. And almost every theory is just that - a theory, at best. Although synaptic homeostasis does have at least fifteen years of history behind it as a theory.

    And just like artificial intelligence is artificial, so their analogue of sleep is an analogue of it. Maybe it should be "artificial sleep" to not confuse it with actual sleep.

It's really interesting to consider that homeostasis is not just the domain of the musculo-mechanical realm of the body but also the electro-neuro dimension. Which would mean our knowledge and nerves want to remain at the level they're at, and we have to stress or antagonize them a little into accepting more data. And if I understand it correctly, sleep becomes the process where a new homeostasis level is achieved, just like micromuscle tears building into new muscle and bone growth deposits crystallizing. Sleep gives us what the "new normal" is.

    Add it all up and you still get really cool stuff, and we may in fact be seeing a true representation of human activity, which goes beyond cool.

    --
    This sig for rent.
    • (Score: 1) by anubi on Friday February 22 2019, @11:24PM

      by anubi (2828) on Friday February 22 2019, @11:24PM (#805373) Journal

      Defrag.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 3, Insightful) by Joe Desertrat on Friday February 22 2019, @11:42PM (1 child)

      by Joe Desertrat (2454) on Friday February 22 2019, @11:42PM (#805384)

      And just like artificial intelligence is artificial, so their analogue of sleep is an analogue of it. Maybe it should be "artificial sleep" to not confuse it with actual sleep.

Each step of advance in these analogue methods brings us closer to the day when there is little difference between human intelligence (in all its manifestations) and AI. Right now we are probably barely even into the "beginning to make tools" stage of AI, but considering how fast that has come along, we aren't going to have to wait 5 million years or so for AI to reach the current stage of human intelligence.

      • (Score: 2) by All Your Lawn Are Belong To Us on Friday March 01 2019, @06:48PM

        by All Your Lawn Are Belong To Us (6553) on Friday March 01 2019, @06:48PM (#808844) Journal

        We will possibly (probably) not have to wait that long. If the theoretical models are in fact the way it works. If not, we will experience roadblocks commensurate with the degree of inaccuracy between theory and reality.

Wait for the day when somebody figures out how to have Deep Mind's Alpha not only train its neural net with ASICs, but program its own ASIC neural net trainers to achieve the goals that Deep Mind itself determines it needs to achieve. (AlphaGo doesn't learn Go because the programmers want it to; rather, Deep Mind becomes AlphaShogi because it, itself, has determined that it wishes to learn Shogi.) That is the day that Deep Mind (or its successor) is no longer an "artificial intelligence" but "real intelligence" to me. Let's just hope it learns love and compassion as well, ahead of that.

        --
        This sig for rent.
  • (Score: 1, Funny) by Anonymous Coward on Friday February 22 2019, @11:41PM (3 children)

    by Anonymous Coward on Friday February 22 2019, @11:41PM (#805383)

    If I wanted my sex robots to sleep, I would have gotten married instead.

    • (Score: 0) by Anonymous Coward on Friday February 22 2019, @11:52PM (2 children)

      by Anonymous Coward on Friday February 22 2019, @11:52PM (#805387)

      Sex robot rapes master for 256 hours straight.

      • (Score: 0) by Anonymous Coward on Saturday February 23 2019, @01:00AM (1 child)

        by Anonymous Coward on Saturday February 23 2019, @01:00AM (#805413)

        255 hours. Remember, everything starts at zero.

        • (Score: 0) by Anonymous Coward on Saturday February 23 2019, @01:20AM

          by Anonymous Coward on Saturday February 23 2019, @01:20AM (#805420)

          Infinite overflow.

  • (Score: 0) by Anonymous Coward on Saturday February 23 2019, @12:19AM (1 child)

    by Anonymous Coward on Saturday February 23 2019, @12:19AM (#805394)

RNNs of all stripes are finicky, and once you get to a certain point you really cannot train them anymore or they get worse, not better. This is called overfitting: they get perfect on the training data and then begin to suck terribly at the real world.

    I didn't see anything at all to indicate how they were dealing with the problem of overfit.

Graves-style LSTM designs deal with this to a lesser extent by "forgetting" after a while, i.e. removing weight from data that hasn't been seen in a long time.
It sounds to me like they are maybe causing it to forget more while sleeping. But this still doesn't fix the overfitting problem.

Either way, there is no need for the AI to actually "sleep". This functionality could easily be implemented as a "micro sleep" management system that fires every few minutes, similarly to Java's garbage collector. This would reduce downtime and increase availability, especially if it could be slipstreamed between nodes, with one node handling the "I'm awake" duties while the other garbage-collects on a copy.
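For anyone who hasn't seen the failure mode the parent describes, here is a minimal, self-contained sketch of overfitting (numpy, made-up data; the polynomial degrees and sample sizes are arbitrary illustrative choices, not anything from TFA): a degree-9 polynomial that threads every noisy training point versus a modest cubic.

```python
# Overfitting in miniature: the high-capacity model nails the training
# set and does worse on held-out data. Data and degrees are made up.
import numpy as np

rng = np.random.default_rng(42)
xs = np.linspace(0, 1, 10)
ys = np.sin(2 * np.pi * xs) + rng.normal(0, 0.1, size=xs.shape)  # noisy train

x_test = np.linspace(0.05, 0.95, 50)
y_test = np.sin(2 * np.pi * x_test)                              # clean held-out

def fit_and_eval(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(xs, ys, degree)
    train_err = np.mean((np.polyval(coeffs, xs) - ys) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = fit_and_eval(3)   # modest capacity
over_train, over_test = fit_and_eval(9)       # interpolates every point

print(f"degree 3: train MSE {simple_train:.4f}, test MSE {simple_test:.4f}")
print(f"degree 9: train MSE {over_train:.4f}, test MSE {over_test:.4f}")
```

The degree-9 fit's training error is essentially zero because 10 coefficients can pass through 10 points exactly; everything it "learned" beyond the cubic is noise.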

    • (Score: 2, Informative) by Anonymous Coward on Saturday February 23 2019, @01:38AM

      by Anonymous Coward on Saturday February 23 2019, @01:38AM (#805427)

      Not to mention what they are describing used to be called pruning.

      But that would not get the news headline I guess...

      Too boring.
