
SoylentNews is people

posted by LaminatorX on Saturday March 01 2014, @12:30PM   Printer-friendly
from the Call-me-once-you've-quantified-'love' dept.

AnonTechie writes:

"Can a Computer Fall in Love if It Doesn't Have a Body? Much has been written about Spike Jonze's Her, the Oscar-nominated tale of love between man and operating system. It's an allegory about relationships in a digital age, a Rorschach test for technology. It's also premised on a particular vision of artificial intelligence as capable of experiencing love.

Poetic license aside, is that really possible?"

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1, Funny) by Debvgger on Saturday March 01 2014, @12:47PM

    by Debvgger (545) on Saturday March 01 2014, @12:47PM (#9080)

    But with no body it certainly couldn't love itself :-)

    • (Score: 2, Informative) by Mr_Flibble on Saturday March 01 2014, @03:38PM

      by Mr_Flibble (286) on Saturday March 01 2014, @03:38PM (#9139)

      I thought that was what the loopback interface was for.

      --
      Just because I suffer from paranoia doesn't mean people aren't out to get me.
      • (Score: 0) by Anonymous Coward on Sunday March 02 2014, @02:03AM

        by Anonymous Coward on Sunday March 02 2014, @02:03AM (#9341)

        Like this [ebaumsworld.com]?? :D

  • (Score: 4, Informative) by bucc5062 on Saturday March 01 2014, @12:47PM

    by bucc5062 (699) on Saturday March 01 2014, @12:47PM (#9081)

Now really guys, I thought we talked about this before. I'm going to go with, "Are they paying attention?" And since that question's NOT a headline, I will answer yes.

    Betteridge's Law applied...No.

Now, it's off to the corner, Mr/Ms Editor, for not only did you present a story in a manner more commensurate with ... well ... you know, but it is really a dumb one as well. A rather enjoyable discussion was held on the topic of AI and when that will happen, and Love, as I understand it, will need to come after, not before, sentience.

    No...No...and one more time...No.

    --
    The more things change, the more they look the same
    • (Score: 1) by HiThere on Saturday March 01 2014, @05:02PM

      by HiThere (866) Subscriber Badge on Saturday March 01 2014, @05:02PM (#9161) Journal

      I've *got* to disagree. The emotional structure of the program needs to be in place BEFORE full sentience appears. Afterwards it would be immoral to try to coerce the change...and probably lead to a justifiable revolt.

      Of course, this sort of depends on what you are willing to call sentience. A dog is capable of love. So is a cow (though they rarely love people). I'm not sure about a mouse...I haven't gone looking for evidence. My suspicion is that it is. I believe, with insufficient evidence, that the capability for love is inherent in the basic mammalian brain structure. I suspect that birds, also, are capable of love, but that's less certain (and partially depends on your precise definition).

      In mammals emotions are mediated through chemicals as well as neural firings, but I can see no reason why this should be a requirement. Mirror neuron equivalents, however, may be. Or possibly the structure could be equivalenced at a higher level of abstraction.

      N.B.: Merely having emotions doesn't imply that the emotions will be cognate to those of humans, or of any other animal. But it *is* a requirement. (I suspect that they will generally be quite different unless careful work is done to make them cognate...and that the work will be necessary, because otherwise people won't understand them.)

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 1) by SlimmPickens on Sunday March 02 2014, @12:17AM

        by SlimmPickens (1056) on Sunday March 02 2014, @12:17AM (#9289)

        The emotional structure of the program needs to be in place BEFORE full sentience appears.

I think there will be a very wide variety of ways to bootstrap a mind, and in thirty years, regardless of what laws exist, people will have enough computing power on their desks for a lot of unethical experiments that no one else need know about.

        In mammals emotions are mediated through chemicals as well as neural firings, but I can see no reason why this should be a requirement. Mirror neuron equivalents, however, may be. Or possibly the structure could be equivalenced at a higher level of abstraction.

        "Neural firings" being a chemical phenomenon aside, I think that basically everyone in AGI agrees with this, and would even lean on the side of algorithmic equivalents. Demis Hassabis (the guy Google just paid £400m for Deep Mind) can be found on Youtube advocating what he calls something like the "middle road algorithmic approach, guided where possible by neuroscience" and Ben Goertzel essentially does that, even if he hasn't had a great deal of neuroscience to guide him in the many years he's been creating AGI.

        • (Score: 1) by HiThere on Monday March 03 2014, @12:59AM

          by HiThere (866) Subscriber Badge on Monday March 03 2014, @12:59AM (#9846) Journal

I think you misunderstand my proposal. I'm proposing that neural firings, and even neurons, are the wrong level to model; you need to model what they are doing. Rather like compilers mapping the same code onto different processor designs: the assembler-level code may be very different when produced to run on two different CPUs, particularly ones with very different abstractions that they, in turn, turn into bit manipulations (down at the level where half-adders, etc. work). And, of course, it's even possible not to have the base level implemented in terms of bits. (In the early 1950s there was a computer that worked in base 10 at the lowest level, i.e., storing 10 different voltage levels in the same cell.)

          So you can model things at lots of different levels and achieve approximately the same results, and I suspect that the neuron level is too low a level to choose to model when you're building an AI.

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 2) by SMI on Sunday March 02 2014, @08:56AM

        by SMI (333) on Sunday March 02 2014, @08:56AM (#9460)

        THIS is why I enjoy SoylentNews and this community so much. I'm inclined to believe that pretty much anything could be presented, and ridiculously good debate will ensue. Thanks guys. :)

    • (Score: 3, Informative) by mattie_p on Saturday March 01 2014, @08:34PM

      by mattie_p (13) on Saturday March 01 2014, @08:34PM (#9218) Journal

      We pay attention. I don't think we're deliberately toying with you. One of the things the editors have worked out is the principle of minimum changes. We can make any changes we want and still have it say that the submitter wrote it. We don't want to do that. We would much rather post something that changes as little as possible about the submission, even if it isn't perfect.

      We're still striking that balance, as you can see. But, for now, I'd rather err on the side of caution, and leave the words and flow of thought intact, than to make changes that abuse our privileges within the system.

      Thanks for reading, and keep up the constructive criticism! ~mattie_p

      • (Score: 2) by SMI on Sunday March 02 2014, @08:58AM

        by SMI (333) on Sunday March 02 2014, @08:58AM (#9462)

        This is very informative, thank you mattie_p. Keep up the good work!

  • (Score: 3, Insightful) by nobbis on Saturday March 01 2014, @12:49PM

    by nobbis (62) on Saturday March 01 2014, @12:49PM (#9082) Homepage Journal

    Betteridge's [wikipedia.org] law applies

    --
    It's easy to look up when your mind's in the gutter
    • (Score: 1) by SlimmPickens on Sunday March 02 2014, @01:31AM

      by SlimmPickens (1056) on Sunday March 02 2014, @01:31AM (#9325)

Betteridge's law is stupid. This question is essentially "will we create AGI?" and the answer to that is anything but "No".

  • (Score: 3, Interesting) by RobotMonster on Saturday March 01 2014, @01:23PM

    by RobotMonster (130) on Saturday March 01 2014, @01:23PM (#9092) Journal

Betteridge aside, there are already simulations of entire insect minds.
Eventually we will have enough computing power and expertise to create accurate simulations of entire humans.
Why would these simulations be immune from love?

    Virtual pheromones expressed by one simulacrum's virtual body smelt by another's virtual nose would have the same effects as they do in the real world, as would any other type of virtual interaction. Connect the simulations to real-world robots for real-world interaction as well, and I'm sure love will blossom in unexpected ways.

We are a long way off from doing any of this; I'm waiting to see the first AI that can compete against a crow at solving novel problems.

    • (Score: 3, Insightful) by mcgrew on Saturday March 01 2014, @02:07PM

      by mcgrew (701) <publish@mcgrewbooks.com> on Saturday March 01 2014, @02:07PM (#9110) Homepage Journal

      A simulation of a brain will produce real thoughts and emotions like a simulation of an atomic bomb produces real radiation and fallout.

      A simulation describes reality to the limits of its processor, memory, and storage. A simulation doesn't create reality, it's simply a description.

      --
      mcgrewbooks.com mcgrew.info nooze.org
      • (Score: 0, Redundant) by sar on Saturday March 01 2014, @02:31PM

        by sar (507) on Saturday March 01 2014, @02:31PM (#9118)

        Exactly

      • (Score: 4, Informative) by RobotMonster on Saturday March 01 2014, @02:31PM

        by RobotMonster (130) on Saturday March 01 2014, @02:31PM (#9119) Journal

It'll seem real enough to the simulation. The universe we inhabit, for all we know, is running on a Raspberry Pi in the real world.

        If you're interested in this philosophical debate, I can highly recommend reading Greg Egan's Permutation City [wikipedia.org].

      • (Score: 3, Insightful) by geb on Saturday March 01 2014, @02:32PM

        by geb (529) on Saturday March 01 2014, @02:32PM (#9120)

        Love is not a physical product. It might be associated with neurochemistry and electrical signals in the brain, but neurochemicals are not what people mean when they say "love". It is the behaviour that matters.

        If we suppose, for the sake of argument, that there were a computer powerful enough to simulate a human brain right down to the movements of individual atoms, how is the behaviour generated in that system less meaningful than that of a physical brain?

Unless you want to start arguing for some kind of Cartesian dualism, there's no difference in the process.

        • (Score: 2) by RobotMonster on Saturday March 01 2014, @02:37PM

          by RobotMonster (130) on Saturday March 01 2014, @02:37PM (#9123) Journal

          Well said, thank you!

        • (Score: 4, Insightful) by mcgrew on Saturday March 01 2014, @04:03PM

          by mcgrew (701) <publish@mcgrewbooks.com> on Saturday March 01 2014, @04:03PM (#9143) Homepage Journal

          ...neurochemicals are not what people mean when they say "love".

          Neurochemicals are not what people mean when they say "hate" or "pain", either. But hate, pain, love, are feelings. Love isn't how you act towards someone, it's how you feel about someone.

          Sociopaths are incapable of love, but they are incredibly good at faking it. The behavior can be faked, the feeling cannot. And the feeling is nothing but chemistry, same as thought itself.

          --
          mcgrewbooks.com mcgrew.info nooze.org
          • (Score: 3, Interesting) by geb on Saturday March 01 2014, @04:59PM

            by geb (529) on Saturday March 01 2014, @04:59PM (#9160)

            I wasn't trying to argue that feelings are nothing more than the output signals generated to move a body. You're right that calling it behaviour was poor wording on my part.

            Perhaps a better phrase would be that the activity of the mind is what people mean when they talk about emotions.

            The mind is the important bit. It would be silly to say that a jar full of synthetic dopamine represents perfect happiness/comfort/arousal/whatever. Similarly, if you pump a load of dopamine into a brain where all the dopamine receptors have been damaged, that shouldn't count as happiness either. The chemical might be there, but the activity in the mind isn't.

            What matters is fitting the right trigger into the right receptor to adjust the running of the mind, and that statement works the same way whether it's a chemical trigger, an electrical signal, or a bit flipped in a simulation.

            • (Score: 4, Interesting) by Ethanol-fueled on Saturday March 01 2014, @05:54PM

              by Ethanol-fueled (2792) on Saturday March 01 2014, @05:54PM (#9172) Homepage

              What makes the movie seem creepy and pathetic ( Disclaimer: I saw the preview trailer but not the whole movie ) is the ever-present nagging fact that real "chemical attraction" requires chemicals. Not only the chemicals within your brains, but natural chemicals like pheromones and skin/hair oils as well as unnatural scents like perfume and conditioner -- and we're not even talking other associated stimuli like pleasant dinners and whatnot.

              I remember being in Basic Training outside in formation, having not been laid or even having jacked off in weeks, and what drove us males crazy was not the sight or the sounds of female voices, but the smell of their perfume and conditioner which hit us hard, hastened our heartbeats and made us shift in our boots even at attention.

  • (Score: -1, Offtopic) by MichaelDavidCrawford on Saturday March 01 2014, @01:32PM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday March 01 2014, @01:32PM (#9097) Homepage Journal

It's not in the App Store yet. It's almost done, but someone lifted my iPhone, I hocked my iPad when I was busted flat, and now I don't have the cash for the Apple Tax. However, I just scored a real good contract, so development has already resumed in the Simulator, with testing on real hardware resuming when I get my first check a little over a week from now.

If you'd like to beta test, send your UDID to mdcrawford@gmail.com [mailto]. If you don't know how to fetch your iDevice's UDID, follow these instructions [warplife.com].

    So anyway...

Few remember anymore that Conway's Game of Life [conwaylife.com] was originally developed by Cambridge mathematician John Horton Conway as an artificial intelligence research tool in 1970. Back then I used to play it on a checkerboard.

    Sometime around 1980, Conway proved that there is a Turing-Complete Computer somewhere in the Life Universe, but he didn't produce the actual computer. I don't have a clue how he managed to pull that one off. So sometime later, TWO DIFFERENT such computers were designed.

So I tell those who are unfamiliar with computers, but who own iDevices, that someday Warp Life will run on an iPad 42. It will fall into a purely artificial love with another instance of Warp Life, or Golly, or another of my competitors. It will court its purely artificial heart's desire; their joyous friends, family and loved ones will join them to celebrate a purely artificial wedding, and someday they will celebrate the birth of their very first purely artificial child.

    Perhaps, one day, Warp Life will worship a purely artificial G-d.

    For this to happen, we need research into the algorithms. That's what most folks at http://conwaylife.com/ [conwaylife.com] do. I can't fathom how they can come up with those amazing new patterns, what I prefer to refer to as "Animals".

    We also need bigger, faster computers, for example what the San Diego Supercomputer Center is doing. They recently - well a year or so ago - announced a new superbox that uses mostly Flash for memory, rather than dynamic RAM, so it uses far less power.

    And someone is going to need to make Life itself run much faster, however fast the speed of the box it's running on. That is my part, with Warp Life.
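For anyone who has never implemented it, a single generation of Life fits in a few lines of Python. This is a deliberately naive sketch of the B3/S23 rules for illustration only; a fast engine like Golly leaps whole generations at a time with far cleverer algorithms such as HashLife:

```python
# Naive single-generation step of Conway's Game of Life (rule B3/S23).
# The universe is a set of (x, y) coordinates of live cells.
from collections import Counter

def step(live):
    """Return the next generation, given a set of (x, y) live cells."""
    # Count live neighbours for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Representing the universe as a sparse set of live cells keeps even this naive version unbounded in grid size; the speed problem is purely in how many cells you touch per tick.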

    I haven't compared them yet but Golly may be faster. However Golly does not run on the iPhone, just the iPad. Warp Life runs on the iPhone, iPad and iPod Touch.

    I agonized over what to charge for Warp Life in the App Store, at some point settling on $4.99. But no, in the end I decided there is a better way. It will be free, there will be a source code tarball included in the Application Bundle that you can transfer to your desktop with iTunes, and that code will have a Free Software license.

    (Not an Open Source license. Do you understand the difference?)

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by SMI on Sunday March 02 2014, @09:08AM

      by SMI (333) on Sunday March 02 2014, @09:08AM (#9467)

      I don't think that this post is off topic.

  • (Score: 2, Insightful) by bd on Saturday March 01 2014, @01:33PM

    by bd (2773) on Saturday March 01 2014, @01:33PM (#9098)

Sorry, I can't really take this article seriously. Maybe it should not have made it to the front page... just saying. I'm not angry or anything, as the general quality of the editing has been very good. In this one case the summary and headline are a blatant copy of the first paragraph and title of the Wired article. I hated it when this was done on /.; the summary is supposed to summarize the article, and the headline should not be "journalistic".

    As to the content of the linked article: "Random scientist asked about whether a movie depicts his field of research in a realistic manner". Answer: "No".

    And now we should discuss whether he is right...

  • (Score: 2, Insightful) by NewMexicoArt on Saturday March 01 2014, @01:34PM

    by NewMexicoArt (1369) on Saturday March 01 2014, @01:34PM (#9100)

    can a computer fall in love?

    depends on how you define love.

if you can come up with a definition that everyone agrees with, you will not only answer this question but have a best-selling book.

    • (Score: 1) by Wakaranai on Saturday March 01 2014, @03:16PM

      by Wakaranai (486) on Saturday March 01 2014, @03:16PM (#9131)

      I recall Haddaway said "What is love? Baby, don't hurt me. Don't hurt me no more"

      Not sure if that helps...

  • (Score: 3, Funny) by mcgrew on Saturday March 01 2014, @02:03PM

    by mcgrew (701) <publish@mcgrewbooks.com> on Saturday March 01 2014, @02:03PM (#9107) Homepage Journal

    Only from the Sirius Cybernetics Corporation and their Genuine People Personalities.

Life -- hate it or loathe it, you can't ignore it. And I have a terrible pain down the diodes in my left leg...

    --
    mcgrewbooks.com mcgrew.info nooze.org
  • (Score: 3, Interesting) by pixeldyne on Saturday March 01 2014, @02:03PM

    by pixeldyne (2637) on Saturday March 01 2014, @02:03PM (#9108)

    many computers hate me

  • (Score: 1, Interesting) by MichaelDavidCrawford on Saturday March 01 2014, @02:08PM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday March 01 2014, @02:08PM (#9111) Homepage Journal

    "Idoru" is Japanese for "Idol Singer", that is, a singer who might have some technical ability but maybe no real talent, who is backed by the entertainment companies just to make a buck - or rather a Yen - off the young people.

    I understand that The Monkees were that way back in the sixties.

    So in Gibson's book, they come up with an artificially-intelligent Idoru who has no physical body but who is a holographic projection.

    The lead character is a teenage girl who is a member of a fan club for some American male singer. They grow concerned when this guy falls in love with the Idoru. He travels to Japan to court her. His American fans grow concerned that his sanity is at risk.

    My favorite scene in the book is when this teenage girl and her Japanese friend need to get online, but in a clandestine way, so they rent a very expensive room at a fastidiously discreet "Love Motel". But then her friend realizes that someone is tracing down their location. It turns out that the teenage girl charged a bottle of water from the room refrigerator to her credit card.

    Fast forward to today, and I don't use grocery store loyalty cards anymore, even though they would save me a lot of money on my groceries.

When I first read of Edward Snowden, he was holed up in a fancy hotel - was it Hong Kong? - charging room service to his credit card while trying to hide from the NSA. What a damn fool; it was probably a clandestine agent delivering his morning coffee.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 1) by MichaelDavidCrawford on Saturday March 01 2014, @02:35PM

      by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday March 01 2014, @02:35PM (#9122) Homepage Journal

In a SoylentNews discussion of computers falling in love, I discuss a novel about a human who falls in love with an artificial intelligence.

I am further stymied as to how so many - what shall we call each other? Soylentbots just doesn't roll off the tongue - fellow SoylentNews members are complaining that this entire article is off-topic.

      What could be more on-topic for a bunch of propellerheads than a discussion of artificial intelligences being made to feel human emotion?

Sure, winning at Jeopardy is cool, as is becoming the world chess champion, but really both amount to exhaustive search. There's no real finesse in that, just a lot of transistors and investment.

      Love is not well-understood to this day. It's going to be a long time before it is simulated in an AI.

      --
      Yes I Have No Bananas. [gofundme.com]
      • (Score: 2) by bucc5062 on Saturday March 01 2014, @03:58PM

        by bucc5062 (699) on Saturday March 01 2014, @03:58PM (#9142)

I read your post and, though it was interesting in one way, it did not really touch on the topic as presented. The thought is "can a computer love?" and your story is more like "a man falls in love with a computer". Also, you kind of skip around on topic between the story and this leap to Snowden and hiding from government spies. I see the connection between the story and reality, but not between the story and the topic... thus off-topic.

        However, I now am going to look up that book and if I like it, read it. Sounds interesting....but is off topic (like my response in turn). HAGD

        --
        The more things change, the more they look the same
      • (Score: 1) by SlimmPickens on Sunday March 02 2014, @01:09AM

        by SlimmPickens (1056) on Sunday March 02 2014, @01:09AM (#9311)

what shall we call each other? Soylentbots just doesn't roll off the tongue

        "Soylentils" seems to be gaining momentum, but I am of the opinion that when we deny service with our traffic we'll have "Soyled" the web server and therefore we are "Soylers".

    • (Score: -1, Offtopic) by Anonymous Coward on Saturday March 01 2014, @04:53PM

      by Anonymous Coward on Saturday March 01 2014, @04:53PM (#9158)

      http://en.wikipedia.org/wiki/Hatsune_Miku [wikipedia.org]

      In October 2011, Crypton showed on the official Hatsune Miku Facebook page a letter from the Japanese Minister of Economy for "contributing to the furtherance of the informatization by minister of economy."

      (mod post offtopic)

    • (Score: 2) by SMI on Sunday March 02 2014, @09:17AM

      by SMI (333) on Sunday March 02 2014, @09:17AM (#9473)

      "Fast forward to today, and I don't use grocery store loyalty cards anymore..."

      As an OT aside, if you know someone's phone number whom you aren't too fond of, try using that phone number in place of the physical loyalty card. ;)

  • (Score: 1) by mechanicjay on Saturday March 01 2014, @02:22PM

    by mechanicjay (7) <mechanicjayNO@SPAMsoylentnews.org> on Saturday March 01 2014, @02:22PM (#9115) Homepage Journal
I suppose with a sophisticated enough AI, anything is possible. That aside, it's interesting to me how the same machine will react differently to the same inputs from different people. Sometimes we call these machines temperamental, or say they have character. After some time in the world, they develop a distinct personality, even though they began as one of an identical production run. I have a car, inherited from my grandpa, which I've put something like 50k trouble-free miles on. I can't let my dad use it because *every* time he drives it, it'll leave him stranded. I take cross-country road trips in the thing, but it won't even carry him to the grocery store. If a purely mechanical beast can have personal likes and dislikes, I think that a complex layer of software should be able to enhance this. Of course, is this love like a person, or loyalty like a dog? Perhaps that's the real question, as the former is a much more complex thing.
    --
    My VMS box beat up your Windows box.
    • (Score: 2) by SMI on Sunday March 02 2014, @09:31AM

      by SMI (333) on Sunday March 02 2014, @09:31AM (#9480)

      "Of course, is this love like a person, or loyalty like dog? Perhaps that's the real question as the former is a much more complex thing."

      I found your comment very interesting, but I have to take umbrage with those last two sentences. I think your question is far more subjective than you may realize. Don't get me wrong, while I may not agree with your opinion of dogs, I strongly believe in the individualized ability to choose for oneself, and I do agree that love is a more complex concept than loyalty. I simply refuse to take it as a fact that dogs aren't capable of love (only loyalty) and that people are somehow inherently better than other animals.

    • (Score: 1) by blackest_k on Sunday March 02 2014, @11:17AM

      by blackest_k (2045) on Sunday March 02 2014, @11:17AM (#9515)

That can be mechanical sympathy: if you're tuned in to a particular vehicle, you know when you're pushing it hard and when it sounds off. You know how to operate it smoothly, how to brake, how to change gear. Your dad probably isn't in tune with that car and doesn't know how to treat it right; with his past experience, there is probably going to be negativity right from opening the door.

With computers, you generally have an idea of how hard one is working: you don't push so much work onto it that it's running out of RAM, and you've probably killed all those extra processes that got pushed onto the system when you installed some software.

      The same spec system with someone who can barely use it will have a much worse time of it.
It's like my internet connection, which is very poor at handling more than one or two jobs at a time. If I open 3 or 4 tabs at the same time, chances are all four will fail to load; if, on the other hand, I allow a tab to complete and then load another, there isn't a problem. Sometimes I need to reboot the router if I've been working it too hard.

It might be described as personality; after all, in my interactions I know the quirks and how to get the best out of a system. I guess it's a bit like driving a strange car: the first time you drive it, you don't know how it responds. Brakes are normally quite different between cars; some require a long, hard push, others you barely touch and they try to faceplant you on the windscreen. The first time you'll be bad with a new (to you) car; after a couple of drives you will have a feel for it.

  • (Score: 2) by martyb on Saturday March 01 2014, @02:23PM

    by martyb (76) Subscriber Badge on Saturday March 01 2014, @02:23PM (#9116) Journal

    I posit that the story title conflates several aspects and suggest the following for consideration:

    1. A person can love a computer.
    2. A computer can love a person.
    3. A computer and a person can have a mutual, loving relationship.

    As I see it, those are the primary topics of discussion, but I may have missed something as I am in a hurry; please feel free to augment and/or discuss below.

    --
    Wit is intellect, dancing.
    • (Score: 1) by Debvgger on Saturday March 01 2014, @04:07PM

      by Debvgger (545) on Saturday March 01 2014, @04:07PM (#9145)

      I love Commodore Amiga computers :-)

  • (Score: 5, Insightful) by Jerry Smith on Saturday March 01 2014, @04:12PM

    by Jerry Smith (379) on Saturday March 01 2014, @04:12PM (#9147) Journal

    Can a submarine swim?

    --
    All those moments will be lost in time, like tears in rain. Time to die.
  • (Score: 2, Interesting) by takyon on Saturday March 01 2014, @06:09PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday March 01 2014, @06:09PM (#9181) Journal

    The real question: will we create a "strong" AI?

    Obviously a person can fall in love with a computer and a computer can be programmed to fake emotions and original thoughts. Once (or if) we have "genuine" strong AI, the love question becomes more interesting. For example, if the AI outpaces human intelligence, would we be too inferior to hold a relationship of equals? Maybe you would have to augment your own intelligence to stay relevant.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 1) by amblivious on Saturday March 01 2014, @07:25PM

    by amblivious (26) on Saturday March 01 2014, @07:25PM (#9198)

Someone introduced me to a series called Black Mirror. Series 2, episode 1, "Be Right Back", covers an analogous situation where a computer simulates a woman's dead husband by using all his online communications. If you've not seen this series before, I highly recommend it. Each episode is an independent, thought-provoking short story with insights into the impacts on society of the new technologies we're just starting to come to grips with. A must-see for geeks.