
SoylentNews is people

posted by LaminatorX on Saturday March 01 2014, @12:30PM   Printer-friendly
from the Call-me-once-you've-quantified-'love' dept.

AnonTechie writes:

"Can a Computer Fall in Love if It Doesn't Have a Body? Much has been written about Spike Jonze's Her, the Oscar-nominated tale of love between man and operating system. It's an allegory about relationships in a digital age, a Rorschach test for technology. It's also premised on a particular vision of artificial intelligence as capable of experiencing love.

Poetic license aside, is that really possible?"

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1) by SlimmPickens on Sunday March 02 2014, @12:17AM

    by SlimmPickens (1056) on Sunday March 02 2014, @12:17AM (#9289)

    The emotional structure of the program needs to be in place BEFORE full sentience appears.

    I think there will be a very wide variety of ways to bootstrap a mind, and in thirty years, regardless of what laws exist, people will have enough computing power on their desks for a lot of unethical experiments that no one else need know about.

    In mammals, emotions are mediated through chemicals as well as neural firings, but I can see no reason why this should be a requirement. Mirror neuron equivalents, however, may be. Or possibly the structure could be equivalenced at a higher level of abstraction.

    "Neural firings" being a chemical phenomenon aside, I think basically everyone in AGI agrees with this, and would even lean toward algorithmic equivalents. Demis Hassabis (the guy Google just paid £400m for DeepMind) can be found on YouTube advocating what he calls something like a "middle road algorithmic approach, guided where possible by neuroscience," and Ben Goertzel essentially does that, even if he hasn't had a great deal of neuroscience to guide him in the many years he's been creating AGI.

  • (Score: 1) by HiThere on Monday March 03 2014, @12:59AM

    by HiThere (866) Subscriber Badge on Monday March 03 2014, @12:59AM (#9846) Journal

    I think you misunderstand my proposal. I'm proposing that neural firings, and even neurons, are the wrong level to model. You need to model what they are doing. Rather like compilers mapping the same source code onto different processor designs: the assembler-level code may be very different when produced to run on two different CPUs, particularly ones with very different abstractions that they, in turn, reduce to bit manipulations (down at the level where half-adders, etc., work). And, of course, it's even possible not to have the base level implemented in terms of bits. (In the early 1950s there was a computer that worked in base 10 at the lowest level, i.e., storing 10 different voltage levels in the same cell.)

    So you can model things at lots of different levels and achieve approximately the same results, and I suspect that the neuron level is too low a level to model when you're building an AI.
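    The level-of-modeling point above can be sketched in code. This is a toy illustration (mine, not from the discussion, with hypothetical names): the same behavior, XOR, built once as a "neuron-level" network of threshold units and once as a plain higher-level function. Both produce identical outputs on every input, so anything interacting with them only through their behavior cannot tell the substrates apart.

    ```python
    def step(x, threshold=0.5):
        """Heaviside step activation: the unit fires iff input crosses the threshold."""
        return 1 if x >= threshold else 0

    def xor_neural(a, b):
        """Neuron-level model: XOR as a two-layer network of threshold units."""
        h1 = step(1.0 * a - 1.0 * b)   # hidden unit fires when a AND NOT b
        h2 = step(-1.0 * a + 1.0 * b)  # hidden unit fires when b AND NOT a
        return step(h1 + h2)           # output fires when either hidden unit fires

    def xor_abstract(a, b):
        """Higher-level model: the same behavior, with no neurons at all."""
        return a ^ b

    # The two models agree on every input despite totally different "substrates".
    for a in (0, 1):
        for b in (0, 1):
            assert xor_neural(a, b) == xor_abstract(a, b)
    ```

    Swapping one implementation for the other changes nothing observable, which is the sense in which the structure can be "equivalenced at a higher level of abstraction."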

    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.