
posted by LaminatorX on Saturday March 01 2014, @12:30PM   Printer-friendly
from the Call-me-once-you've-quantified-'love' dept.

AnonTechie writes:

"Can a Computer Fall in Love if It Doesn't Have a Body? Much has been written about Spike Jonze's Her, the Oscar-nominated tale of love between man and operating system. It's an allegory about relationships in a digital age, a Rorschach test for technology. It's also premised on a particular vision of artificial intelligence as capable of experiencing love.

Poetic license aside, is that really possible?"

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1) by HiThere on Saturday March 01 2014, @05:02PM

    by HiThere (866) Subscriber Badge on Saturday March 01 2014, @05:02PM (#9161) Journal

    I've *got* to disagree. The emotional structure of the program needs to be in place BEFORE full sentience appears. Afterwards it would be immoral to try to coerce the change...and probably lead to a justifiable revolt.

    Of course, this sort of depends on what you are willing to call sentience. A dog is capable of love. So is a cow (though they rarely love people). I'm not sure about a mouse...I haven't gone looking for evidence, but my suspicion is that it is capable. I believe, with insufficient evidence, that the capability for love is inherent in the basic mammalian brain structure. I suspect that birds, also, are capable of love, but that's less certain (and partially depends on your precise definition).

    In mammals emotions are mediated through chemicals as well as neural firings, but I can see no reason why this should be a requirement. Mirror neuron equivalents, however, may be. Or possibly the structure could be equivalenced at a higher level of abstraction.

    N.B.: Merely having emotions doesn't imply that the emotions will be cognate to those of humans, or of any other animal. But it *is* a requirement. (I suspect that they will generally be quite different unless careful work is done to make them cognate...and that the work will be necessary, because otherwise people won't understand them.)

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 1) by SlimmPickens on Sunday March 02 2014, @12:17AM

    by SlimmPickens (1056) on Sunday March 02 2014, @12:17AM (#9289)

    The emotional structure of the program needs to be in place BEFORE full sentience appears.

    I think there will be a very wide variety of ways to bootstrap a mind, and in thirty years, regardless of what laws exist, people will have enough computing power on their desks for a lot of unethical experiments that no one else need know about.

    In mammals emotions are mediated through chemicals as well as neural firings, but I can see no reason why this should be a requirement. Mirror neuron equivalents, however, may be. Or possibly the structure could be equivalenced at a higher level of abstraction.

    "Neural firings" being a chemical phenomenon aside, I think that basically everyone in AGI agrees with this, and would even lean toward algorithmic equivalents. Demis Hassabis (the guy Google just paid £400m for DeepMind) can be found on YouTube advocating what he calls something like a "middle road algorithmic approach, guided where possible by neuroscience," and Ben Goertzel does essentially that, even if he hasn't had a great deal of neuroscience to guide him in the many years he's been building AGI.

    • (Score: 1) by HiThere on Monday March 03 2014, @12:59AM

      by HiThere (866) Subscriber Badge on Monday March 03 2014, @12:59AM (#9846) Journal

      I think you misunderstand my proposal. I'm proposing that neural firings, and even neurons, are the wrong level to model: you need to model what they are doing. Rather like compilers mapping the same code onto different processor designs. The assembler-level code may be very different when produced to run on two different CPUs...particularly ones with very different abstractions that they, in turn, translate into bit manipulations (down at the level where half-adders, etc., work). And, of course, it's even possible not to have the base level implemented in terms of bits. (In the early 1950s there was a computer that worked in base 10 at the lowest level, i.e., storing 10 different voltage levels in the same cell.)

      So you can model things at lots of different levels and achieve approximately the same results, and I suspect that the neuron level is too low a level to choose to model when you're building an AI.
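      The compiler/half-adder point above can be sketched in a few lines. This is just an illustrative toy (the function names are mine, not from any real project): an 8-bit ripple-carry adder built from half-adders computes the same function as Python's built-in integer addition, even though one models the bit level and the other a much higher abstraction.

      ```python
      # Toy illustration: the same behavior modeled at two levels.
      # Low level: half-adders composed into an 8-bit ripple-carry adder.
      # High level: Python's built-in + (mod 256). Both agree on results.

      def half_adder(a, b):
          """One-bit half-adder: returns (sum_bit, carry_bit)."""
          return a ^ b, a & b

      def full_adder(a, b, carry_in):
          """One-bit full-adder built from two half-adders."""
          s1, c1 = half_adder(a, b)
          s2, c2 = half_adder(s1, carry_in)
          return s2, c1 | c2

      def add_8bit(x, y):
          """Add two integers bit by bit, as hardware would (mod 256)."""
          result, carry = 0, 0
          for i in range(8):
              bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
              result |= bit << i
          return result

      # The bit-level model agrees with the high-level abstraction:
      assert all(add_8bit(x, y) == (x + y) % 256
                 for x in range(256) for y in range(0, 256, 17))
      ```

      Whether you model the adder gate by gate or just write `+`, the observable behavior is identical; the argument is that the same may hold for neurons versus what neurons are doing.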

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 2) by SMI on Sunday March 02 2014, @08:56AM

    by SMI (333) on Sunday March 02 2014, @08:56AM (#9460)

    THIS is why I enjoy SoylentNews and this community so much. I'm inclined to believe that pretty much anything could be presented, and ridiculously good debate will ensue. Thanks guys. :)