
posted by n1 on Wednesday August 20 2014, @02:51PM   Printer-friendly
from the avoiding-accidents-is-dangerous-driving dept.

BBC reports that according to Dmitri Dolgov, lead software engineer for Google's driverless car project, Google's self-driving cars are programmed to exceed speed limits by up to 10 mph when surrounding vehicles are breaking the speed limit, because going more slowly could actually present a danger. In many countries, including the United States, the speed limit is a rather nebulous thing. It's posted, but on many roads hardly anybody obeys it.

Almost every driver speeds regularly, and anybody going at or below the limit on a clear road outside the right lane is typically an obstruction to traffic—they will find themselves being tailgated or passed at high speed on the left and right. A ticket for going 1 mph over the limit is an extremely rare thing and usually signals a cop with another agenda or a special day of zero-tolerance enforcement. In fact, many drivers feel safe from tickets up to about 9 mph over the limit. Tickets happen there, but the major penalties require going faster, and most police like to go after that one weaving, racing guy who thinks the limit does not apply to him. Commenting on Google self-drive cars' ability to exceed the speed limit, a Department for Transport spokesman said: "There are no plans to change speed limits, which will still apply to driverless cars".

  • (Score: 1) by Wootery on Wednesday August 20 2014, @11:17PM

    by Wootery (2341) on Wednesday August 20 2014, @11:17PM (#83746)

    Who SHOULD be responsible when a self-driving car commits a traffic violation? The owner, even if they're not present? The passenger, even if they're not in control of the car at the time? The company that made the car?

    Well, not the owner, obviously, any more than you would be guilty of murder if I killed someone with your crowbar.

    If the car is totally without human driving override, then clearly the responsibility lies with the organisation that built the car. If a dangerously faulty aircraft autopilot causes trouble, we blame the organisation that made the autopilot.

    The person in the car might be able to reconfigure the car's destination, but suppose its speed were entirely out of their control. In that case, they really are just a passenger, the same way they'd be a passenger if a taxi driver were at the wheel.

    If there is manual override, I'd say the responsibility is shared, but it's less clear-cut. There's no taxi driver analogy there.

  • (Score: 2) by MrGuy on Wednesday August 20 2014, @11:40PM

    by MrGuy (1007) on Wednesday August 20 2014, @11:40PM (#83750)

    Well, not the owner, obviously, any more than you would be guilty of murder if I killed someone with your crowbar.

    And hence my original post comparing the situation to red light cameras, which take PRECISELY this legal theory - the owner of the car is presumed guilty of crimes committed by a driver of the car, regardless of whether they were the driver or not.

    And if you think the government is going to prosecute Google (or any other car manufacturer) for manslaughter, you're dreaming. Heck, it's not even really possible - who would go to jail if you DID convict Google of "murder"?

    We're very close to Asimovian territory here, where there's going to need to be a whole lotta law and a whole lotta precedent regarding criminal law and autonomous devices (self driving cars, piloted and unpiloted drones, etc.)

    I'm betting that any "crimes" committed by autonomous devices get classified as "industrial accidents." And trial lawyers are going to have a field day with the notion of joint and several liability for civil damages - you only have to be a little negligent as the maker of an autonomous car to get sued. Then the big car manufacturers whine to Congress, and a big shield law gets passed, provided the autonomous cars pass some notional standard set by an NHTSA-like body (which the manufacturers can continually lobby to lower its standards). And on and on we go...

    • (Score: 1) by Wootery on Thursday August 28 2014, @03:29PM

      by Wootery (2341) on Thursday August 28 2014, @03:29PM (#86757)

      Agreed - this isn't so different from those 'old' issues that it's likely to be treated differently.