
posted by janrinok on Sunday August 09 2015, @06:53PM
from the do-you-remember-your-stopping-distances? dept.

Einstein once said, "Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour and it seems like a minute. THAT'S relativity."

So 5-8 seconds seems like a (relatively) short amount of time. But, is it enough to safely take back control of a self-driving car and negotiate a road hazard? And if the driver is given less time, is it better or worse? Researchers at Stanford attempted to find out:

In this study, we observed how participants (N=27) in a driving simulator performed after they were subjected to an emergency loss of automation. We tested three transition time conditions, with an unstructured transition of vehicle control occurring 2 seconds, 5 seconds, or 8 seconds before the participants encountered a road hazard that required the drivers' intervention.

Few drivers in the 2 second condition were able to safely negotiate the road hazard situation, while the majority of drivers in the 5 or 8 second conditions were able to navigate the hazard safely.

Although the participants in the current study were not performing secondary tasks while the car was driving, the 2 second condition appeared to be insufficient. The participants did not perform well and liked the car less. Additionally, participants' comfort in the car was also lower in the 2 second condition. Hence, it is recommended to give warnings or relinquish control more than 2 seconds in advance. While not necessarily the minimum required time, the 5 second condition appeared to be sufficient for drivers to perform the takeover successfully and negotiate the problem. While the results of this study indicated that there was a minimum amount of time needed for the transition of control, this was true when the drivers only monitored the car's activity and did not perform secondary tasks. It is possible that these results could change if the drivers were occupied with other activities.
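To put those windows in concrete terms, here is a quick illustrative calculation (not from the paper, and the 65 MPH speed is an assumption): the distance a car covers during each transition window at a constant highway speed.

    # Illustrative only (the 65 MPH speed is an assumption, not from the study):
    # distance covered during each takeover window before reaching the hazard.
    MPH_TO_MS = 0.44704                 # miles per hour -> metres per second
    speed_ms = 65 * MPH_TO_MS           # ~29 m/s

    for window_s in (2, 5, 8):
        distance_m = speed_ms * window_s
        print(f"{window_s} s window: ~{distance_m:.0f} m travelled before the hazard")

    # 2 s -> ~58 m, 5 s -> ~145 m, 8 s -> ~232 m

At that speed, a 2 second warning leaves less than 60 metres of road in which to re-engage, which is consistent with the poor performance the study reports for that condition.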

Full research paper available here.



 
  • (Score: 3, Interesting) by VLM (445) Subscriber Badge on Monday August 10 2015, @11:31AM (#220630)

    Anyway, the idea of a driver (or the operator of any machinery) having to watch an automated process and constantly make judgements, not fully controlling it but staying ready to intervene within seconds, as a matter of life or death, is absolutely crazy.

    Well, that's an industrial factory line machine operator position. As fewer people are involved in manufacturing and the illegals take over, it's no surprise that kind of work has dropped out of public consciousness. Observationally, people either freak out and burn out in a week, or they get used to the idea of not giving much of a F and last until the next accident, at which point they get fired for not giving a F and take another minimum wage machine operator job at a competitor. Essentially they're being paid minimum wage to be scapegoats for software failures. I suppose there are white collar jobs like that too.

    Anyway, the point is that people taking over after a machine fails doesn't really work all that often. There is an interesting related strategy for self-driving cars. Last night I drove home, ate dinner, played Minecraft for a bit, and then went to bed. So rather than driving home at 80 MPH for 20 miles, why not drive home at 25 MPH on side streets, stay off the dangerous interstate, and let the chips fall where they may while I play Minecraft, maybe eat a packed dinner in my car, maybe take a nap? It's hard to kill the passengers in a car in a collision well under 25 MPH; I suppose the most likely cause of death would be the automated car driving off a bridge, or maybe a drunk in a hand-driven car running a red light at 100 MPH. Let's say my car (not I) hit a parked car at 25 MPH, which would pretty well total both cars but is unlikely to kill or even injure anyone. I can go all machine operator "who cares" and let the black box and insurance company fight Toyota's software engineers all they want; I just don't need to care. The ideal self-driving commuter car would probably look a lot more like an enclosed golf cart on a side street than a giant (empty) SUV on the interstate.
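    As a rough back-of-the-envelope sketch of that trade-off (taking the 20 mile commute and the 80 vs 25 MPH figures above as given; everything else is illustrative):

        # Back-of-the-envelope numbers for the trade-off above: a 20 mile
        # commute driven at 80 MPH vs 25 MPH, plus the relative crash energy.
        distance_mi = 20

        for speed_mph in (80, 25):
            minutes = distance_mi / speed_mph * 60
            print(f"{speed_mph} MPH: {minutes:.0f} minute trip")   # 15 min vs 48 min

        # Kinetic energy scales with the square of speed, so an 80 MPH impact
        # carries roughly (80 / 25)**2, about 10x, the energy of a 25 MPH one.
        print(f"crash energy ratio: {(80 / 25) ** 2:.1f}x")        # ~10.2x

    So the slow route costs roughly half an hour each way, in exchange for an order-of-magnitude reduction in crash energy.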

    Going further, imagine the roads filled with bumper cars. Sure, it takes "a while" to get there, but people already brag about how long they sit in rush hour traffic jams, so never getting over 5 MPH is not a real issue. Being bumper cars with half horsepower motors, they'll get pretty decent mileage, and if they hit another bumper car, well, no harm no foul, they are bumper cars after all. Something like bumper car RVs would be a sight to see...
