
posted by martyb on Wednesday May 22 2019, @12:21PM   Printer-friendly
from the Marvin dept.

No one is yet quite sure how human consciousness comes about, but many seem to assume that it will arise as a function of artificial intelligence. Isn't it just as reasonable to think that emotions will appear as an aspect of consciousness, along with a presumed will to survive? The answers to these questions have yet to emerge, but in the interim, is it a good idea to push ahead in the development of artificial intelligence when we have such a limited understanding of our own? What about the possibility of mental illness? Even if we succeed in endowing AI with a morality compatible with our own, what would we do with a superhuman intelligence that becomes delusional, or worse, psychotic? Would we see it coming? We can't prevent it from happening to ourselves, so what makes us think we could prevent it in a machine?

Nervously awaiting learned opinions,
VT


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by Immerman (3985) on Thursday May 23 2019, @01:40AM (#846473)

    >We should probably, initially at least, still require a passenger with a driver's license in order that, should some level of operational parameters for the AI be exceeded, they can be alerted to take over control of the vehicle.

    If they have to do that while the car is moving, I think such systems will be irresponsibly dangerous. It's completely unreasonable to expect a thoroughly distracted passenger to start paying attention to the road, get their bearings in a difficult situation, and then take over driving, all faster than the vehicle could simply bring itself to a safe stop.

    Now, if instead of alerting the passenger to take over while driving, it can come to a safe stop, give the passenger time to wake up and figure out what's going on, and then let them drive away themselves, I'm totally on board. An AI doesn't need to be able to handle everything that comes at it, so long as it recognizes when it has a problem and can avoid making it worse until a human can decide what to do at their leisure. It's perfectly reasonable to require that a backup driver be available for the rare unexpected problem, so long as they don't have to actually be on duty through countless hours of problem-free driving.
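    To make that concrete, here is a rough sketch of the fallback logic being argued for, written as a tiny state machine. Everything in it (class names, the confidence threshold, the inputs) is hypothetical and illustrative, not taken from any real driving stack; the only point it demonstrates is that the handover to the human happens after a controlled stop, never mid-motion.

        # Hypothetical fallback state machine: when the AI loses confidence,
        # it pulls over and stops *before* asking the human to take over.
        from enum import Enum, auto

        class DriveState(Enum):
            AUTONOMOUS = auto()      # AI is driving normally
            STOPPING = auto()        # trouble detected, pulling over safely
            AWAITING_HUMAN = auto()  # vehicle stopped, human asked to take over
            MANUAL = auto()          # human is driving

        class FallbackController:
            def __init__(self, confidence_threshold=0.8):
                self.state = DriveState.AUTONOMOUS
                self.confidence_threshold = confidence_threshold

            def update(self, perception_confidence, vehicle_speed, human_ready):
                """Advance the state machine by one tick of sensor input."""
                if self.state is DriveState.AUTONOMOUS:
                    if perception_confidence < self.confidence_threshold:
                        # Don't throw control at a distracted passenger while
                        # moving; begin a controlled stop instead.
                        self.state = DriveState.STOPPING
                elif self.state is DriveState.STOPPING:
                    if vehicle_speed == 0:
                        # Only once safely stopped do we alert the human.
                        self.state = DriveState.AWAITING_HUMAN
                elif self.state is DriveState.AWAITING_HUMAN:
                    if human_ready:
                        # Human has had time to get their bearings.
                        self.state = DriveState.MANUAL
                return self.state

    The backup driver only enters the picture at AWAITING_HUMAN, i.e. after the vehicle has already made itself safe, which is the whole point of the argument above.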
