
posted by janrinok on Sunday August 09 2015, @06:53PM   Printer-friendly
from the do-you-remember-your-stopping-distances? dept.

Einstein once said, "Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour and it seems like a minute. THAT'S relativity."

So 5-8 seconds seems like a (relatively) short amount of time. But, is it enough to safely take back control of a self-driving car and negotiate a road hazard? And if the driver is given less time, is it better or worse? Researchers at Stanford attempted to find out:

In this study, we observed how participants (N=27) in a driving simulator performed after they were subjected to an emergency loss of automation. We tested three transition time conditions, with an unstructured transition of vehicle control occurring 2 seconds, 5 seconds, or 8 seconds before the participants encountered a road hazard that required the drivers' intervention.

Few drivers in the 2-second condition were able to safely negotiate the road hazard situation, while the majority of drivers in the 5- or 8-second conditions were able to navigate the hazard safely.

Although the participants in the current study were not performing secondary tasks while the car was driving, the 2-second condition appeared to be insufficient. The participants did not perform well and liked the car less. Additionally, participants' comfort in the car was also lower in the 2-second condition. Hence, it is recommended to give warnings or relinquish control more than 2 seconds in advance. While not necessarily the minimum required time, the 5-second condition appeared to be sufficient for drivers to perform the takeover successfully and negotiate the problem. While the results of this study indicated that there was a minimum amount of time needed for transition of control, this was true when the drivers only monitored the car's activity and did not perform secondary tasks. It is possible that these results can change if the drivers are occupied with other activities.
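
In round numbers, those warning windows translate into a surprising amount of road. A back-of-the-envelope sketch in Python; the 65 mph speed is an assumption for illustration, not a figure from the study:

    # How far a car travels during each warning window. The 65 mph
    # figure is an assumed highway speed, not taken from the study.
    MPH_TO_MPS = 0.44704  # miles per hour -> metres per second

    def distance_covered(speed_mph, warning_s):
        """Distance travelled, in metres, during the warning window."""
        return speed_mph * MPH_TO_MPS * warning_s

    for warning in (2, 5, 8):
        print(f"{warning} s at 65 mph covers ~{distance_covered(65, warning):.0f} m")
    # Roughly 58 m, 145 m, and 232 m respectively.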

Full research paper available here.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by VLM on Sunday August 09 2015, @07:25PM

    by VLM (445) on Sunday August 09 2015, @07:25PM (#220357)

    I wonder what fraction of the general population can handle an emergency in only 2 seconds. I've observed some pretty poor driving.

    • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @07:34PM

      by Anonymous Coward on Sunday August 09 2015, @07:34PM (#220359)

      I'm pleased our overlords provide us the chance to kill ourselves so that the insurance won't pay out in our favor.

      Does the car relinquish control back to us when there are red and blue lights flashing behind us? How many seconds does that take?

    • (Score: 2, Interesting) by Nerdfest on Sunday August 09 2015, @07:35PM

      by Nerdfest (80) on Sunday August 09 2015, @07:35PM (#220360)

      I would think even two seconds to take over is way too long. I've avoided accidents where the only thing that saved me was video game reflexes; things like people pulling out from a left turn lane into a through-lane, or backing out at speed from between parked cars that block vision, etc. It's even worse when I'm on a motorcycle, as sudden twitchy responses are almost as dangerous as the thing you're trying to avoid. Make no mistake, when many cars are autonomous there will still be accidents and almost all of them will be caused by human drivers.

      • (Score: 5, Insightful) by frojack on Sunday August 09 2015, @08:01PM

        by frojack (1554) on Sunday August 09 2015, @08:01PM (#220368) Journal

        You misunderstand the point of the study.

        In your cited examples you were driving. So right off the mark you can toss every one of your examples out the window. They are non-germane to the topic at hand. You were driving and were ready to handle emergencies, yet you still failed to anticipate possible bad actions on the part of other drivers. You therefore got yourself into a twitch situation, which defensive driving training would have kept you out of.

        The thing is, a driverless car should not get to two seconds from impact and then hand over control. You could make the case that it shouldn't allow the situation to get to two seconds from impact AT ALL. But if it does, it should handle the emergency itself, rather than handing it off to a human.

        If the human wasn't driving, and the car was, just getting the human brain into driving mode will take more than two seconds. That is what this study shows.

        If automated cars only manage to hand you an emergency situation two seconds away, they will be FAR more dangerous than the vast majority of human drivers (most of whom manage to drive for years on end without an accident). It takes humans a while to switch chains of thought. This study suggests that 5 seconds is the minimum necessary for a human to take over.

        If the autonomous driving mode reaches an emergency state where impact is two seconds away, a controlled crash-avoidance and shutdown routine should take over. You can't hand a human a two-seconds-to-impact scenario unless they were attentively monitoring the vehicle's progress.
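
        In pseudo-Python, the policy being argued for here might look like the sketch below; the 2 s and 5 s thresholds come from the study, while the names and behaviours are invented for illustration.

            # Hypothetical handover policy: never give a human a scenario
            # the study says humans cannot handle.
            AUTOMATED_ONLY_S = 2.0      # study: 2 s is not enough for a human
            HUMAN_TAKEOVER_MIN_S = 5.0  # study: ~5 s appeared sufficient

            def on_emergency(time_to_impact_s, driver_attentive):
                if time_to_impact_s <= AUTOMATED_ONLY_S:
                    # Too late for a handover: run automated crash
                    # avoidance and a controlled shutdown instead.
                    return "automated_avoidance"
                if time_to_impact_s >= HUMAN_TAKEOVER_MIN_S and driver_attentive:
                    return "offer_handover"
                # In between, or driver not monitoring: stay automated.
                return "automated_avoidance"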

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @08:17PM

          by Anonymous Coward on Sunday August 09 2015, @08:17PM (#220373)

          You mean a routine to minimize injury when the crash does happen. Perhaps the routine should warn passengers of the imminent danger so they could brace for impact.

          Terminator scenario, the cars start acting to protect themselves even against their programming.

          • (Score: 2, Funny) by khallow on Monday August 10 2015, @04:32AM

            by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:32AM (#220548) Journal
            I suppose it's long enough to give religion a try. There's not much in the way of bracing for impact that you can do, aside perhaps from pulling your arms in towards your body. I have discovered those things snap easily in an accident.
            • (Score: 0) by Anonymous Coward on Monday August 10 2015, @04:34PM

              by Anonymous Coward on Monday August 10 2015, @04:34PM (#220745)

              Perhaps studies can be done to determine the most favorable position to be in.

        • (Score: 5, Insightful) by sjames on Sunday August 09 2015, @11:52PM

          by sjames (2882) on Sunday August 09 2015, @11:52PM (#220483) Journal

          More to the point, even when the 'drivers' were doing nothing but watch the car drive, 2 seconds wasn't enough. Five seconds was just enough when the 'driver' was watching the car drive attentively, but 8 was better.

          But that's not the real world at all. In the real world, the driver will be on the phone, reading a book, trancing out to music, or asleep. The reality is that the autonomous system MUST be designed to handle the situation no matter what; there is no help to be had from a human. If, for some reason, the automation isn't going to be able to manage, it must be prepared to pull over and come to a complete, safe stop before a human can be counted on to (eventually) take over.

          • (Score: 3, Insightful) by Nuke on Monday August 10 2015, @09:30AM

            by Nuke (3162) on Monday August 10 2015, @09:30AM (#220602)

            In the real world, the driver will be on the phone, reading a book, trancing out to music, or asleep.

            Indeed. Personally, I would probably not be, but plenty of drivers will. Anyway, the idea of a driver (or operator of any machinery) having to watch an automated process and constantly make judgements, as opposed to fully controlling it, ready to intervene with a degree of readiness measured in seconds - as a matter of life or death - is absolutely crazy. It must also be a recipe for reducing people to nervous wrecks, if they do it diligently. Like "Shall I intervene? Shan't I intervene? Shall I intervene? Shan't I intervene? ......."

            It is possible that these results can change if the drivers are occupied with other activities.

            I submit that for the "Understatement of the Year" award.

            • (Score: 3, Interesting) by VLM on Monday August 10 2015, @11:31AM

              by VLM (445) on Monday August 10 2015, @11:31AM (#220630)

              Anyway, the idea of a driver (or operator of any machinery) having to watch an automated process and constantly make judgements, as opposed to fully controlling it, ready to intervene with a degree of readiness measured in seconds - as a matter of life or death - is absolutely crazy.

              Well, that's an industrial factory line machine operator position. As fewer people are involved in manufacturing and the illegals take over, it's no surprise that kind of work has dropped out of consciousness. Observationally, people either freak out and burn out in a week, or they get used to the idea of not giving much of a F and last until the next accident, at which time they get fired for not giving a F and get another minimum-wage machine operator job at a competitor. Essentially they're being paid minimum wage to be scapegoats for software failures. I suppose there are white collar jobs like that too.

              Anyway, the point is that people taking over after a machine fails doesn't really work all that often. There is an interesting related strategy for self-driving cars. Last night I drove home, ate dinner, played minecraft for a bit, and then went to bed. So rather than driving home at 80 MPH for 20 miles, why not drive home at 25 MPH on side streets, stay off the dangerous interstate, and let the chips fall where they may while I play minecraft, maybe eat a packed dinner in my car, maybe take a nap? It's hard to kill the passengers in a car in a collision well under 25 MPH; I suppose the most likely cause of death would be the automated car driving off bridges, or maybe drunks in hand-driven cars running red lights at 100 MPH. Let's say my car (not I) hit a parked car at 25 MPH. That would pretty well total both cars, but it is unlikely to kill or even injure anyone. I can go all machine-operator "who cares" and let the black box and insurance company fight Toyota's software engineers all they want; I just don't need to care. The ideal self-driving commuter car would probably look a lot more like an enclosed golf cart on a side street than a giant (empty) SUV on the interstate.

              Going further, imagine the roads filled with bumper cars. Sure, it takes a while to get there, but people already brag about how long they sit in rush hour traffic jams, so never getting over 5 mph is not a real issue. Being bumper cars with half-horsepower motors, they'll get pretty decent mileage, and if they hit another bumper car, well, no harm no foul; they are bumper cars after all. Something like bumper car RVs would be a sight to see...

            • (Score: 2) by sjames on Thursday August 13 2015, @04:57AM

              by sjames (2882) on Thursday August 13 2015, @04:57AM (#222124) Journal

              Even if they do care, they will inevitably fail eventually. Even when the human is actually driving the car, highway hypnosis is a thing. It can only get worse when the 'driver' doesn't even have to hold the steering wheel.

        • (Score: 2) by Nerdfest on Monday August 10 2015, @02:07PM

          by Nerdfest (80) on Monday August 10 2015, @02:07PM (#220682)

          Defensive driving only goes so far. Someone randomly pulling out of a turn lane 5 feet in front of you when they shouldn't is only really avoidable by reducing your speed to the point where it would cause serious traffic problems every time you encounter another vehicle. Defensive driving can solve most situations, but not all.

      • (Score: 3, Insightful) by VLM on Sunday August 09 2015, @08:26PM

        by VLM (445) on Sunday August 09 2015, @08:26PM (#220377)

        video game reflexes

        You and I have video game reflexes; it's the other 90% of the population I'm worried about. And multiply that by the fraction of the cars on the road having drivers who are high, drunk, or sleepy. Well, my car will drive me home, nobody will be able to tell if I have that extra beer, I can yell at the kids in the back seat for 30 seconds; after all, the car is driving, not me...

        there will still be accidents and almost all of them will be caused by human drivers

        I think you misspelled "software bugs". Also getting pwned by viruses and miscreants and governments. It's interesting to think about swarm issues: yes, genetically identical instinctual swarms usually don't collide very often, but consider multiple cars with various levels of dirt on their cameras and completely different implementations of algos, not to mention strange latency and jitter in response rates, combined with unpredictable "butterfly wing flapping" chaos in road conditions (so a bird in the road in California leads, domino-like, to a car in Florida slipping off the road into the ocean the next day because of an unknown oil slick, along the lines of the butterfly analogy in weather forecasting).

        • (Score: 2) by Reziac on Monday August 10 2015, @04:16AM

          by Reziac (2489) on Monday August 10 2015, @04:16AM (#220544) Homepage

          "completely different implementations of algos" -- Windows Common Files in action!

          --
          And there is no Alkibiades to come back and save us from ourselves.
      • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @08:36PM

        by Anonymous Coward on Sunday August 09 2015, @08:36PM (#220381)

        This would be a serious problem. When I was learning how to drive almost 20 years ago, the school taught defensive driving because 2 seconds really isn't enough time. When driving you need to be watching out for dangerous situations and avoiding as many of them as possible.

        I'm guessing that rather than switching to manual controls, in most cases just pressing a button to tell the car to stop would be the appropriate course of action. The car behind you ought to automatically come to a stop as well. Now, if more than one car is malfunctioning, I'm not sure that taking the controls yourself several minutes later is going to help.

    • (Score: 2) by davester666 on Monday August 10 2015, @07:38AM

      by davester666 (155) on Monday August 10 2015, @07:38AM (#220575)

      And, of course, it is entirely unsurprising to find people don't like a car that tells them "whups, I guess it's time for you to take over" too late for them to be able to deal with the situation.

  • (Score: 2) by AnonTechie on Sunday August 09 2015, @07:40PM

    by AnonTechie (2275) on Sunday August 09 2015, @07:40PM (#220361) Journal

    What happens when you are distracted ... listening to music, watching something on your laptop, sending SMS on your cell phone ...

    --
    Albert Einstein - "Only two things are infinite, the universe and human stupidity, and I'm not sure about the former."
    • (Score: 1) by khallow on Sunday August 09 2015, @09:47PM

      by khallow (3766) Subscriber Badge on Sunday August 09 2015, @09:47PM (#220422) Journal
      Or you popped the seatbelt, went over the seat, and are now rummaging through the cooler in the back? Yeah, two seconds sounds more than adequate.
    • (Score: 2) by c0lo on Monday August 10 2015, @01:22AM

      by c0lo (156) Subscriber Badge on Monday August 10 2015, @01:22AM (#220500) Journal

      What happens when you are distracted ... listening to music, watching something on your laptop, sending SMS on your cell phone ...

      You die.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 4, Insightful) by Pslytely Psycho on Sunday August 09 2015, @07:47PM

    by Pslytely Psycho (1218) on Sunday August 09 2015, @07:47PM (#220364)

    "While the results of this study indicated that there was a minimum amount of time needed for transition of control, this was true when the drivers only monitored the car's activity and did not perform secondary tasks. It is possible that these results can change if the drivers are occupied with other activities."

    If you need to monitor your self-driving car in case this happens, then you might as well be driving it yourself.

    Hmmm, if you're drunk, and it's driving you home and then requires you to take the wheel in an emergency, could you be charged with a DUI or manslaughter when you invariably make the wrong choice and crash/kill someone? How about if you're asleep? Or fail to respond at all because you don't actually know how to drive because a computer has been doing it for you since you got your license and your only actual driving experience is GTA IX? Liability lawyers are going to get very rich.

    Either you drive or you don't. If the computer can't handle the situation, and is unable to fail safe, then it is not going to help to put a (likely) inexperienced driver in a potentially life-threatening situation. Even the most experienced driver would struggle to cope with "Take over manual control in 5, 4, 3, 2, 1, now." Especially if they were sleeping, intoxicated, making out, reading or any other distraction at all. Road hazards happen quickly, and you don't get a second chance. And with more than a five second warning the car/truck should be able to come to a controlled stop by itself. It only takes 6 seconds (dry roads, level surface, good equipment) to stop a fully loaded semi-truck (lorry for EU folks), so warnings longer than a few seconds shouldn't even occur.

    Perhaps I'm over critical as I train professional drivers for a living, but I sure as hell have no interest in being chauffeured by silicon. I love to drive. Trucks during the workweek and the 'Vette on the weekends.

    --
    Alex Jones lawyer inspires new TV series: CSI Moron Division.
    • (Score: 2) by frojack on Sunday August 09 2015, @08:10PM

      by frojack (1554) on Sunday August 09 2015, @08:10PM (#220371) Journal

      Pretty much spot on.

      Actually, if the computer can detect a manual intervention requirement 5 seconds away, it should probably initiate emergency avoidance measures rather than handing it to a human.

      If the human was attentive and monitoring progress, they can override, take control and avoid the situation themselves.

      But unless the computer can somehow know this, it should just fail-safe.

      --
      No, you are mistaken. I've always had this sig.
    • (Score: 1) by khallow on Sunday August 09 2015, @11:48PM

      by khallow (3766) Subscriber Badge on Sunday August 09 2015, @11:48PM (#220481) Journal

      If you need to monitor your self-driving car in case this happens, then you might as well be driving it yourself.

      There's some point to moderate automation. Cruise control works well, for example.

      • (Score: 2) by q.kontinuum on Monday August 10 2015, @02:25AM

        by q.kontinuum (532) on Monday August 10 2015, @02:25AM (#220515) Journal

        Problem is, if you have to pay attention all the time, it is easier to drive yourself. Otherwise you will (or at least I would) inadvertently slack off, think of other things, etc., and in that case even the 5 seconds would not be realistic. So, I would second the point: either driving is automated enough to contain a fail-safe, where the car pulls over and stops, giving the driver enough time to wake up, or it isn't automated. It could still be assisted (distance guard, lane guard with a vibrating effect if you accidentally leave your lane, speed limit warning, parking aid, etc.). Cruise control is already dangerous in my opinion, since it will maintain the current speed even if the driver dozes off. Responsibility would clearly rest with the driver all the time.

        --
        Registered IRC nick on chat.soylentnews.org: qkontinuum
        • (Score: 1) by khallow on Monday August 10 2015, @04:28AM

          by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:28AM (#220547) Journal

          Cruise control is already dangerous in my opinion, since it will maintain the current speed even if the driver dozes off.

          So will the lack of cruise control. I don't see a significant drop in speed happening before the car and whatever it hits/lands on develops severe repair issues.

      • (Score: 2) by c0lo on Monday August 10 2015, @02:26AM

        by c0lo (156) Subscriber Badge on Monday August 10 2015, @02:26AM (#220516) Journal

        If you need to monitor your self-driving car in case this happens, then you might as well be driving it yourself.

        There's some point to moderate automation. Cruise control works well, for example.

        Until it doesn't work as expected [foxnews.com] or it's defective and you burn [jalopnik.com].

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
        • (Score: 2, Interesting) by khallow on Monday August 10 2015, @04:24AM

          by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:24AM (#220545) Journal
          What doesn't have those problems in a car? Classic examples are the brake and accelerator pedals.
      • (Score: 3, Insightful) by TheRaven on Monday August 10 2015, @08:47AM

        by TheRaven (270) on Monday August 10 2015, @08:47AM (#220593) Journal
        Cruise control is useful, but there are fairly strict limits. Safety doesn't have a simple correlation with automation. With no automation, your entire attention is focused on the task and you're pretty likely to handle unusual conditions. With full automation, you don't need any attention on the task and it's pretty safe.

        When you have a little bit of automation, like cruise control, it can mean that you've removed some things from your locus of attention but all of the safety-critical things are still there and get more of your attention. Now imagine cruise control taken a bit further, so it will also do lane tracking and keep you in lane. All that you need to do is steer when the road markings aren't clear or when there's an unexpected obstacle. Now steering is no longer part of your attention and so it takes a little while to bring it back into your awareness. The time when you actually need to be involved in the process is when you are least mentally prepared to be.

        Various forms of computer assistance in cars are likely to be a lot less safe than either a fully manual or a fully automatic vehicle.

        --
        sudo mod me up
      • (Score: 2) by Nuke on Monday August 10 2015, @09:55AM

        by Nuke (3162) on Monday August 10 2015, @09:55AM (#220611)

        There's some point to moderate automation. Cruise control works well, for example.

        Personally, I've had a couple of close calls with cruise control.

        When approaching, say, a stationary queue of traffic ahead, I normally ease off the throttle and then brake to a comfortable and "perfect" stop without even thinking about it, practically by instinct; with cruise I find I need to think more consciously about when to knock it off, and did not get it comfortably right at first. Now, after those experiences, I knock off the cruise as soon as there is any sign of a possible need to slow ahead, including above a certain density of traffic, long before any need to brake appears. I wonder about other people using cruise, though.

        I have heard stories about 1950's Cadillacs (the first with cruise control?), when cars had bench front seats, of drivers sitting sideways with legs up along the seat, back against the door, steering with their left hand. There is also a story in the UK of a guy who used to drive his Morris 8, which had a throttle that could be set, sitting on the roof, legs down through the open sunroof, steering with his feet. [Ref : "Signalman's Morning" by Adrian Vaughan].

    • (Score: 2) by tangomargarine on Monday August 10 2015, @12:15AM

      by tangomargarine (667) on Monday August 10 2015, @12:15AM (#220493)

      It only takes 6 seconds (dry roads, level surface, good equipment) to stop a fully loaded semi-truck (lorry for EU folks), so warnings longer than a few seconds shouldn't even occur.

      Perhaps I'm over critical as I train professional drivers for a living, but I sure as hell have no interest in being chauffeured by silicon. I love to drive. Trucks during the workweek and the 'Vette on the weekends.

      Semi stopping from what speed? I'd love to see video evidence of a semi going from 70mph to 0 in 6 seconds.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 3, Informative) by Pslytely Psycho on Monday August 10 2015, @01:52AM

        by Pslytely Psycho (1218) on Monday August 10 2015, @01:52AM (#220507)

        Ooops. Proofread fail.

        We teach a minimum of 6 seconds Following Distance (10 preferred).
        It is around 12-14 seconds to stop from 65 mph. (Emergency stop.)

        I substituted following distance for stopping distance.
        my bad.
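
        As a sanity check on the corrected figure, a bit of constant-deceleration arithmetic (an idealisation; real braking profiles vary):

            # Implied average deceleration for a 65 mph emergency stop
            # taking 12-14 s, assuming constant deceleration throughout.
            v0 = 65 * 0.44704  # 65 mph in m/s, ~29.1
            for t in (12, 14):
                a = v0 / t      # average deceleration
                d = v0 * t / 2  # stopping distance
                print(f"{t} s: a ~ {a:.1f} m/s^2 ({a / 9.81:.2f} g), d ~ {d:.0f} m")
            # 12 s: ~2.4 m/s^2 (0.25 g) over ~174 m; 14 s: ~2.1 m/s^2 over ~203 m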

        --
        Alex Jones lawyer inspires new TV series: CSI Moron Division.
    • (Score: 2) by Reziac on Monday August 10 2015, @04:14AM

      by Reziac (2489) on Monday August 10 2015, @04:14AM (#220543) Homepage

      Another problem with self-driving cars is that they reduce the net amount of driver experience. (For some people, this might become effectively zero driving experience.) How is reduced experience going to improve their reaction time and quality of response when they're required to take over?

      Methinks the "flustered effect" will be a lot more deadly than this study gives it credit for. Some people don't recover from startlement in minutes, let alone seconds. And people become a lot more flustered when they have to jump into a task than if they're already immersed in it.

      I too enjoy driving, but even if that weren't the case... I am very skeptical. I think self-driving cars may look good on a small scale, but on a large scale have a much larger potential for disaster than do human drivers.

      --
      And there is no Alkibiades to come back and save us from ourselves.
      • (Score: 1) by khallow on Monday August 10 2015, @04:36AM

        by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:36AM (#220550) Journal
        And some people will be flustered for days not minutes even when they do know how to drive.
        • (Score: 2) by Reziac on Monday August 10 2015, @05:05AM

          by Reziac (2489) on Monday August 10 2015, @05:05AM (#220553) Homepage

          True, because the next time they get startled by their car, their reaction will be extended flusterment, rather than situation-appropriate. It's like the more they get startled, the more flustered they get. Yeah, it's a failure of adaptability, but it's not so rare that it should be ignored.

          --
          And there is no Alkibiades to come back and save us from ourselves.
      • (Score: 2) by Pslytely Psycho on Monday August 10 2015, @09:20AM

        by Pslytely Psycho (1218) on Monday August 10 2015, @09:20AM (#220600)

        " I think self-driving cars may look good on a small scale, but on a large scale have a much larger potential for disaster than do human drivers. "

        Yes, they are going to have to become so good that having to take over manually never happens. Maybe to the point of no manual controls at all. Many people will simply freeze and make no decision out of fear of making the wrong decision, a trait that usually results in me failing a student driver for their own good (and that of the general public at large). I failed about 10% on that alone; in an emergency you must make a decision. Whether it is good or bad is a combination of knowledge, skill and luck (sometimes there is no good decision). I tried to impart as much skill and knowledge as I could, but there is a subset of people that have no business behind the wheel of any vehicle, much less one that can weigh 80,000 lbs.

        Better to have the vehicle fail-safe to the side of the road than to have a potentially unskilled person 'take over' during an emergency.

        --
        Alex Jones lawyer inspires new TV series: CSI Moron Division.
        • (Score: 3, Interesting) by Reziac on Monday August 10 2015, @03:29PM

          by Reziac (2489) on Monday August 10 2015, @03:29PM (#220723) Homepage

          Yeah, some people cannot learn that sort of decision-making, so best to simply avoid the issue in their case.

          I'm the opposite; my body reacts appropriately to the emergency before my brain consciously recognises it. Thus I've avoided a number of serious accidents, probably the most notable being near head-ons thanks to CA's stupid incarnation of a "move over" law -- which is making people drive on the wrong side of busy two-lane roads to avoid a ticket, and the trouble is some people are so worried about the potential ticket that they both neglect to notice oncoming traffic and kinda jerk the wheel over so they're suddenly in your lane and coming your way. So before I can even think about it, I'm driving down the grassy shoulder to avoid 'em (dodging posts on the way, but not going out so far that I get into soft ground and can't get back to the road). What would a self-driving car do in the same situation? Would it obey the "move over" law? Would it avoid the hazard of another self-driver that did so inappropriately? Would it ditch sensibly based on terrain, or any damn place? How do you generalize this in software, since obviously you can't plan for every possible case? (At least, not with the variety of roads and terrain we have today. Seems to me they're trying to reinvent the railroad, minus the tracks.)

          It occurs to me to wonder about a cultural factor too: I watch a lot of those dashcam crash videos, and shake my head at messes I never saw even in 28 years driving in Los Angeles.... and while on the surface it looks like oblivious or bad driving, I think there's something else at work: these vids come very largely from parts of the world where the concept of personal right of way favors the oncoming person, so they just barrel on through any damn place, and you're expected to get out of their way (whereas it's the other way around in America). To explain in case I'm not clear, here if you're standing still and someone walks toward you, they're expected to go around you. There, you'd be expected to move out of their way. This may work all right on the sidewalk but doesn't work worth a damn with the momentum of multi-ton wheeled objects; it's not going to magically get out of your way. What happens when that cultural perception gets translated to a self-driving car?? or is that really what's, uh, driving the whole concept, the fact that the culture of "get outta my way" is unsuited to humans doing the driving?

          --
          And there is no Alkibiades to come back and save us from ourselves.
  • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @07:48PM

    by Anonymous Coward on Sunday August 09 2015, @07:48PM (#220365)

    How about a button I can push and let the computer get me out of an emergency situation. Why should I be subservient to the computer and forced to take over control at its whim? It should be the other way around...

    • (Score: 0) by Anonymous Coward on Monday August 10 2015, @10:26AM

      by Anonymous Coward on Monday August 10 2015, @10:26AM (#220618)

      How about if I don't have to press such a button, but the computer simply continues to control the car unless I explicitly take over?

  • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @08:22PM

    by Anonymous Coward on Sunday August 09 2015, @08:22PM (#220375)

    The only way to make autonomous cars safe from regular human-driven cars would be to separate them, either via secured lanes or entire roads.

    • (Score: 1) by khallow on Sunday August 09 2015, @11:50PM

      by khallow (3766) Subscriber Badge on Sunday August 09 2015, @11:50PM (#220482) Journal
      Why would we want that? If your autonomous car needs to be "safe" from regular drivers, then it probably shouldn't be on the road at all.
  • (Score: 4, Insightful) by TrumpetPower! on Sunday August 09 2015, @08:27PM

    by TrumpetPower! (590) <ben@trumpetpower.com> on Sunday August 09 2015, @08:27PM (#220378) Homepage

    This is a situation where the greatest danger lies in the middle ground.

    There are two ideals: the human is in full control of the car's motion with no reliance upon automated navigation, or the car is fully automated. And, obviously, the first ideal assumes that the human is devoting full attention to the act of maneuvering the car.

    It's that obvious caveat that points to the perils of semi-autonomous driving. If the car is going to automatically slam on the brakes to prevent you from rear-ending the car in front of you, what incentive do you have to try to avoid rear-end collisions? That's the car's problem. But, rather than devote the attention you'd normally use to avoiding rear-end collisions to avoiding other types of collisions...people turn that attention to texting or eating or grooming or what-not. And, human cognition being the single-tasking context-switching phenomenon it is, the fact that you're now paying more attention to twittering about how neat it is that you don't have to worry about rear-ending somebody also means that you're not paying enough attention to the rest of the hazards on the road.

    One should be wary of black-and-white all-or-nothing characterizations...but I really think this is just such a case. Overwhelmingly, most people should be in robot cars; if you're not going to give your full attention to driving, you and everybody around you will be much better off with the car devoting its full attention to driving. And that means no arguing with the radio, no touching base with clients over the speakerphone, no daydreaming about what you'd like to do with that cute cow-orker. Instead, you need to be looking for driveways where somebody might pull out suddenly with an eye towards how you'd evade, spotting which drivers are being more aggressive and likely to cut you off, searching for idiot bicyclists riding on the sidewalk against traffic at the intersection where you're about to turn and would otherwise blindly run them over...all that sort of thing, all the time.

    That's what defines good driving -- not mere car handling ability, which, of course, is also important. But driving the optimum line at the traction limit around a corner means fuck-all if you didn't give yourself enough room to avoid running over that kid who chased the ball into the street from behind the fence. And, last I checked, the way robot cars are designed to avoid that sort of idiocy isn't with some sort of super-fast reflexes...but by not putting themselves into a situation where such super-fast reflexes are needed in the first place. You know? Like what good human drivers do?

    b&

    --
    All but God can prove this sentence true.
    • (Score: 3, Insightful) by theluggage on Sunday August 09 2015, @09:25PM

      by theluggage (1797) on Sunday August 09 2015, @09:25PM (#220408)

      This is a situation where the greatest danger lies in the middle ground.

      Absolutely. Who'd even want a self-driving car if you were expected to sit there with your full attention on the road? If I can't kick back and forget about driving, I'd rather drive - it's the constant attention and needing eyes in the back of your head that is tiring, not turning the wheel.

      In any case, long-term, if self-driving cars take off you'll end up with "drivers" who hadn't actually driven for years, if at all, and are going to be even worse at taking over in emergencies. Self-driving cars are a great idea, but they're not ready until they are ready - which means that they need to be better at dealing with obstacles than a human driver. Until then, it's probably better to have the auto-drive system as the helper that takes over when the human can't cope (as is already happening with auto-braking, self-parking, etc.).

      The whole liability thing will have to be legislated for, too.

      Personally, I'd rather my indicators didn't self-cancel (e.g. after you've turned into the filter lane but before you reach the actual turn) - and the anti-rollback device in my car, which sometimes stops it rolling back on a slope if your clutch control isn't quite up to scratch, is an accident waiting to happen if you get to depend on it.

      • (Score: 2) by TrumpetPower! on Sunday August 09 2015, @09:43PM

        by TrumpetPower! (590) <ben@trumpetpower.com> on Sunday August 09 2015, @09:43PM (#220419) Homepage

        Self-driving cars are a great idea, but they're not ready until they are ready - which means that they need to be better at dealing with obstacles than a human driver.

        That's a very low hurdle which we've almost certainly passed.

        Self-driving cars don't need to be able to drive better than Mario Andretti on the morning of race day after a good breakfast and coffee.

        They just need to be able to drive as well as the average driver taking the license test at the DMV.

        Because, you see, that's the best that your average human ever actually does drive...and your average human is going to spend a significant fraction of time behind the wheel driving much worse than that -- distracted, sleepy, drunk, suffering from testosterone poisoning, you name it. The robot, on the other hand, is always going to be driving at its peak ability. So, even if that's not enough to avoid all potential crash situations, it's enough to avoid the ones most people would avoid when taking the driving test -- which is the standard we already accept as the maximum (not minimum!) for motor vehicle operation safety today. Simply making that maximum the new minimum would result in a massive increase in safety.

        b&

        --
        All but God can prove this sentence true.
        • (Score: 2) by frojack on Monday August 10 2015, @02:33AM

          by frojack (1554) on Monday August 10 2015, @02:33AM (#220519) Journal

          Self-driving cars don't need to be able to drive better than Mario Andretti on the morning of race day after a good breakfast and coffee.
          They just need to be able to drive as well as the average driver taking the license test at the DMV.
          Because, you see, that's the best that your average human ever actually does drive.

          What drivel. Really, where do you come up with such nonsense?

          Just a casual look at accident rates by age would teach you that experience counts for a LOT. Newly licensed drivers are pretty much expected to be high-risk drivers. Experienced drivers are expected to have far, far fewer accidents.

          Some states are starting to restrict new drivers as to hours of the day they can drive, number of other young people in the vehicle, etc. Why? Because accident statistics say new drivers are inexperienced, and have more fender benders (to say nothing about fatalities).

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 2) by gnuman on Monday August 10 2015, @02:55AM

            by gnuman (5013) on Monday August 10 2015, @02:55AM (#220523)

            Just a casual look at accident rates by age would teach you that experience counts for a LOT. Newly licensed drivers are pretty much expected to be high-risk drivers. Experienced drivers are expected to have far, far fewer accidents.

            That's less to do with experience and more to do with "showing off" and over-confidence.

            • (Score: 1) by khallow on Monday August 10 2015, @04:39AM

              by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:39AM (#220551) Journal

              That's less to do with experience and more to do with "showing off" and over-confidence.

              In other words, lack of experience.

          • (Score: 2) by TrumpetPower! on Monday August 10 2015, @03:48AM

            by TrumpetPower! (590) <ben@trumpetpower.com> on Monday August 10 2015, @03:48AM (#220538) Homepage

            So the only time you've ever taken a driving exam was when you first got your license? And you don't see a problem with a system in which somebody could go half a century without being re-certified?

            Would you get on an aircraft if the pilot didn't have a current BFR [aopa.org]? Would you get on a bus if the driver's CDL had expired?

            If not, what makes you think anybody else should be operating heavy transportation machinery without regular recertification?

            b&

            --
            All but God can prove this sentence true.
        • (Score: 3, Insightful) by theluggage on Monday August 10 2015, @10:03AM

          by theluggage (1797) on Monday August 10 2015, @10:03AM (#220613)

          Self-driving cars don't need to be able to drive better than Mario Andretti on the morning of race day after a good breakfast and coffee.

          I don't think you're talking about a comparable skill-set there.

          They just need to be able to drive as well as the average driver taking the license test at the DMV.

          Now, that is too low a bar. Self-driving vehicles won't be accepted by drivers, or by the law, unless they can prove that they are significantly safer than human drivers.

          One day, a self-driving vehicle is going to kill someone - it's inevitable. Even if they are safer than 95% of drivers, even if the victim deserved a Darwin award and 0% of human drivers would have avoided the accident, it will happen, and there will be huge scrutiny of self-driving cars. At that point, if self-driving cars are to survive, there will have to be clear-cut evidence that they are, overall, an overwhelming safety win.

          If not, what makes you think anybody else should be operating heavy transportation machinery without regular recertification?

          They shouldn't. Driver certification is a joke that only serves to exclude the spectacularly incompetent and the unlucky. In any sane world, drivers would be re-tested every 10 years and getting a license would require logging tens of hours supervised driving covering all road types and conditions. The problem is, we have developed a society where, in many areas, a non-driver is a second-class citizen so it is politically and economically unacceptable to make the qualification to drive too arduous. Society is a bit more picky about who it lets drive a bus full of potential lawsuits.

          Self-driving cars won't get given the same leeway - they'll need the same sort of certification as mass-transport drivers (from the government POV they will be mass transit drivers).

          Would you get on a bus if the driver's CDL had expired?

          Would you get in a car with a driver who got an "average" score on their test the previous month? Maybe - but only because it would be rude not to. People won't be so worried about being rude to computers; in fact, they will be more inclined to suspicion. People, and the law, won't accept self-driving cars without proof that they are overwhelmingly safer than human drivers. Even then, it's going to be touch and go the first time an autonomous vehicle kills someone (and that will happen) and the media and ambulance-chasers get on the case.

             

    • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @09:30PM

      by Anonymous Coward on Sunday August 09 2015, @09:30PM (#220411)

      That's not true. Even right now, as you drive a car, there are computers that take some control of certain aspects of your driving. Anti-lock brakes are the most obvious example. Other examples include speed limiters/throttle control, cruise control, and cars that can automatically park themselves. There are all kinds of middle-ground possibilities, and driving already involves the participation of computers. It just needs to be done wisely, with testing.

    • (Score: 1) by patrick on Monday August 10 2015, @03:29AM

      by patrick (3990) on Monday August 10 2015, @03:29AM (#220535)

      Google's head of their driverless car program agrees with you [ted.com].

  • (Score: 0) by Anonymous Coward on Sunday August 09 2015, @08:42PM

    by Anonymous Coward on Sunday August 09 2015, @08:42PM (#220384)

    Sit with a pretty girl for an hour and it seems like a minute.

    Quote is not socially correct for this century. Replace the girl with a gay dude.

  • (Score: 2) by darkfeline on Sunday August 09 2015, @10:59PM

    by darkfeline (1030) on Sunday August 09 2015, @10:59PM (#220463) Homepage

    An autonomous car shouldn't ever just hand over control. The car should be prepared to handle ALL situations, giving over control immediately IF the driver requests it by, e.g., grabbing the steering wheel or pressing the gas/brake pedals. The time it takes to "take back control", then, is a meaningless figure here.
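
    That override rule is simple to state as code. A minimal sketch, with the signal names and thresholds invented for illustration:

        # The car drives until the human actively grabs a control;
        # control is never pushed onto the human. All values hypothetical.
        STEER_TORQUE_NM = 3.0  # "driver grabbed the wheel" threshold
        PEDAL_TRAVEL = 0.05    # "driver pressed a pedal" threshold

        def control_mode(steer_torque_nm, brake, gas):
            driver_input = (abs(steer_torque_nm) > STEER_TORQUE_NM
                            or brake > PEDAL_TRAVEL
                            or gas > PEDAL_TRAVEL)
            return "manual" if driver_input else "autonomous"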

    Can you imagine going down the highway in your autonomous car, chatting with a friend, when your car says "Nope, I'm outta here, you handle this"?

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 1) by khallow on Monday August 10 2015, @04:41AM

      by khallow (3766) Subscriber Badge on Monday August 10 2015, @04:41AM (#220552) Journal

      Can you imagine going down the highway in your autonomous car, chatting with a friend, when your car says "Nope, I'm outta here, you handle this"?

      I sure can. Wheels in the air before you can say "Huh?"

  • (Score: 3, Interesting) by tangomargarine on Monday August 10 2015, @12:03AM

    by tangomargarine (667) on Monday August 10 2015, @12:03AM (#220488)

    The Four-Second Rule (or whatever number they're using now) springs to mind: At highway speeds, leave at least 4 seconds' distance between you and the vehicle in front of you.

    Of course, nobody ever does around where I live. Give me 2 seconds' reaction time even when *I'm* driving and it's a nontrivial question what happens.

    P.S.: Looking it up online, the Wikipedia article calls it Two Seconds [wikipedia.org]. The summary seems to support my suspicion that we shouldn't be telling people 2 seconds in the first place.
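
    The rule itself is one line of arithmetic; a hypothetical sketch (the speed and gap values are made up) that converts a measured gap into seconds:

        MPH_TO_MPS = 0.44704

        def gap_seconds(gap_m, speed_mph):
            """Time gap, in seconds, to the vehicle ahead."""
            return gap_m / (speed_mph * MPH_TO_MPS)

        # A 30 m gap at 70 mph is about one second -- short of even the
        # two-second rule, let alone four.
        print(f"{gap_seconds(30, 70):.1f} s")  # ~1.0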

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 4, Insightful) by gman003 on Monday August 10 2015, @05:39AM

    by gman003 (4155) on Monday August 10 2015, @05:39AM (#220556)

    IMO, if an autonomous car needs to hand over control, it should engage the following procedures:
    1. Sound an audible alarm in the cabin, lock seat belts, and engage emergency signals. If able, signal other autonomous cars that you are in panic mode and should be treated as an obstacle, not a fellow autonomous vehicle.
    2. Engage brakes as fully as possible without losing traction. This is based on the principle that lower-energy crashes are universally preferable to high-energy crashes.
    3. Fall back to collision-avoidance mode - don't try to follow traffic rules, just do everything possible to not hit anything while stopping.
    4. Remain stationary until a human switches the vehicle to manual operation. Inputs should not be blindly accepted before the mode switch, because humans have an unfortunate tendency to panic, which would likely cause more problems than it could solve.

    Autonomous vehicles, at least the early-generation ones, will have to be able to deal with "not being able to deal with the current conditions". Unmarked pavement or dirt roads or whatever can be expected to break autonomy - I've seen human drivers who can't handle some conditions I've driven in, and I've had to stop driving because conditions became too bad for *me* to safely continue. Coming to a safe stop and letting the human take over until conditions return to normal is the best course of action. And it is also the same course of action that should be taken for a more sudden emergency - anything from "tree falling into roadway" to "truck full of acetylene cylinders flips and catches fire" to "the oil pan literally dropped off the car, we're stopping no matter what eventually" (I've had two of those happen to me and have seen Russian dashcam footage of the other). What do you do? Stop, ASAP, and try not to hit anything while doing so. We can't predict every emergency, but any emergency where stopping is *not* part of the solution is a pretty contrived emergency.
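
    Taken together, the four-step procedure reads naturally as code. An illustrative Python sketch; every method name here is invented and stands in for some real vehicle subsystem:

        def panic_mode(car):
            # 1. Alert the cabin and nearby vehicles.
            car.sound_alarm()
            car.lock_seat_belts()
            car.engage_hazard_lights()
            car.broadcast("panic: treat me as an obstacle")

            # 2 & 3. While stopping: brake as hard as traction allows
            # (lower-energy crashes beat high-energy crashes) and ignore
            # traffic rules, steering only to avoid collisions.
            while car.speed > 0:
                car.brake_at_traction_limit()
                car.steer_to_avoid_obstacles()

            # 4. Stay parked; accept input only after an explicit switch
            # to manual, since panicked inputs shouldn't be obeyed blindly.
            car.wait_for_manual_mode_switch()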

    • (Score: 2) by Pslytely Psycho on Monday August 10 2015, @09:42AM

      by Pslytely Psycho (1218) on Monday August 10 2015, @09:42AM (#220606)

      "We can't predict every emergency, but any emergency where stopping is *not* part of the solution is a pretty contrived emergency."

      As a professional driving instructor, this is the bottom line. And believe me, I have heard every bizarre attempt to justify not stopping in an emergency. "If that guy there came flying out of the parking lot I could speed up, change lanes...." (interrupted by me) "Hit a minimum of two cars, possibly killing someone. Why not just FUCKING STOP?!" I actually had this conversation once..... Still, a hypothetical is better than me yelling "STOP!" at the top of my lungs only to get the reply... "Why?" Crunch. A forty-thousand-dollar traffic signal crashes into the intersection. "That's why." Some people should not drive, ever.....
      (sorry, got a bit off-topic)
      (sorry, got a bit off-topic)

      --
      Alex Jones lawyer inspires new TV series: CSI Moron Division.
    • (Score: 0) by Anonymous Coward on Monday August 10 2015, @10:31AM

      by Anonymous Coward on Monday August 10 2015, @10:31AM (#220619)

      If able, signal other autonomous cars that you are in panic mode and should be treated as an obstacle, not a fellow autonomous vehicle.

      You forgot: Also warn the drivers of non-automated cars. That is, activate the warning lights.

      • (Score: 0) by Anonymous Coward on Tuesday August 11 2015, @02:39AM

        by Anonymous Coward on Tuesday August 11 2015, @02:39AM (#221070)

        You edited out "...and engage emergency signals." at the end of the previous sentence.