
posted by janrinok on Sunday August 09 2015, @06:53PM   Printer-friendly
from the do-you-remember-your-stopping-distances? dept.

Einstein once said, "Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour and it seems like a minute. THAT'S relativity."

So 5-8 seconds seems like a (relatively) short amount of time. But is it enough to safely take back control of a self-driving car and negotiate a road hazard? And if the driver is given less time, is it better or worse? Researchers at Stanford attempted to find out:

In this study, we observed how participants (N=27) in a driving simulator performed after they were subjected to an emergency loss of automation. We tested three transition time conditions, with an unstructured transition of vehicle control occurring 2 seconds, 5 seconds, or 8 seconds before the participants encountered a road hazard that required the drivers' intervention.

Few drivers in the 2 second condition were able to safely negotiate the road hazard situation, while the majority of drivers in 5 or 8 second conditions were able to navigate the hazard safely.

Although the participants in the current study were not performing secondary tasks while the car was driving, the 2 second condition appeared to be insufficient. The participants did not perform well and liked the car less. Participants' comfort in the car was also lower in the 2 second condition. Hence, it is recommended to give warnings or relinquish control more than 2 seconds in advance. While not necessarily the minimum required time, the 5 second condition appeared to give drivers sufficient time before a critical event to perform the takeover successfully and negotiate the problem. While the results of this study indicated that there was a minimum amount of time needed for transition of control, this was true when the drivers only monitored the car's activity and did not perform secondary tasks. It is possible that these results could change if the drivers were occupied with other activities.
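To put the three transition windows in perspective (the dept. line asks about stopping distances), here is a minimal sketch of how far a car travels during each window. The 60 mph speed is an illustrative assumption; the summary does not state the simulator's speed.

```python
# Distance covered during each takeover window at an assumed
# highway speed of 60 mph (not taken from the study itself).
speed_mph = 60
speed_m_per_s = speed_mph * 1609.344 / 3600  # ~26.8 m/s

for window_s in (2, 5, 8):
    distance_m = speed_m_per_s * window_s
    print(f"{window_s} s window: {distance_m:.0f} m travelled")
```

At that speed, the 2 second condition gives the driver only about 54 m before the hazard, versus roughly 134 m and 215 m for the 5 and 8 second conditions.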

Full research paper available here.


Original Submission

  • (Score: 3, Interesting) by Reziac (2489) on Monday August 10 2015, @03:29PM (#220723) Homepage

    Yeah, some people cannot learn that sort of decision-making, so best to simply avoid the issue in their case.

    I'm the opposite; my body reacts appropriately to an emergency before my brain consciously recognises it, and that reflex has let me avoid a number of serious accidents. The most notable were near head-ons thanks to CA's stupid incarnation of a "move over" law, which has people driving on the wrong side of busy two-lane roads to avoid a ticket. The trouble is that some drivers are so worried about the potential ticket that they neglect to notice oncoming traffic and jerk the wheel over, so they're suddenly in your lane and coming your way. Before I can even think about it, I'm driving down the grassy shoulder to avoid them (dodging posts on the way, but not going so far out that I get into soft ground and can't get back to the road).

    What would a self-driving car do in the same situation? Would it obey the "move over" law? Would it avoid the hazard of another self-driver that obeyed it inappropriately? Would it ditch sensibly based on terrain, or any damn place? How do you generalize this in software, since you obviously can't plan for every possible case? (At least, not with the variety of roads and terrain we have today. Seems to me they're trying to reinvent the railroad, minus the tracks.)

    It occurs to me to wonder about a cultural factor too. I watch a lot of those dashcam crash videos and shake my head at messes I never saw even in 28 years of driving in Los Angeles. On the surface it looks like oblivious or bad driving, but I think there's something else at work: these vids come very largely from parts of the world where the concept of personal right of way favors the oncoming person, so they just barrel on through any damn place and you're expected to get out of their way (whereas it's the other way around in America). To be clear: here, if you're standing still and someone walks toward you, they're expected to go around you; there, you'd be expected to move out of their way. That may work all right on the sidewalk, but it doesn't work worth a damn with the momentum of a multi-ton wheeled object, which is not going to magically get out of your way. What happens when that cultural perception gets translated to a self-driving car? Or is that really what's, uh, driving the whole concept: a culture of "get outta my way" that's unsuited to humans doing the driving?

    --
    And there is no Alkibiades to come back and save us from ourselves.