posted by janrinok on Wednesday July 20 2016, @02:11AM   Printer-friendly
from the probably-painted-bright-orange dept.

Germany is planning to require "black boxes" in autonomous and semi-autonomous cars:

Germany plans new legislation to require manufacturers of cars equipped with an autopilot function to install a black box to help determine responsibility in the event of an accident, transport ministry sources told Reuters on Monday. The fatal crash of a Tesla Motors Inc Model S car in its Autopilot mode has increased the pressure on industry executives and regulators to ensure that automated driving technology can be deployed safely.

Under the proposal from Transport Minister Alexander Dobrindt, drivers will not have to pay attention to traffic or concentrate on steering, but must remain seated at the wheel so they can intervene in the event of an emergency. Manufacturers will also be required to install a black box that records when the autopilot system was active, when the driver drove and when the system requested that the driver take over, according to the proposals. The draft is due to be sent to other ministries for approval this summer, a transport ministry spokesman said.

Look for the kill switch next. Also at Ars Technica.


Original Submission

  • (Score: 2) by theluggage (1797) on Wednesday July 20 2016, @09:57AM (#377137)

    "Humans can't do that. If we are told we don't need to pay attention to traffic or concentrate on steering, then we won't."

    This. 1000x.

    There's a quantum leap to be made between modern cruise control/automatic collision avoidance and true autonomy. Once you let the driver stop making second-by-second decisions, they will zone out, and the computer has to be able to deal with all eventualities - at least as well as a reasonable human driver.

    That means it should be legal to operate an autonomous car while drunk, asleep, texting, performing lewd acts with passengers etc. because that is exactly what people will do. That means testing with trained operators in which any human intervention counts as a failure until the incidents-per-mile figure is provably better than for a good human driver (...and that's per comparable mile - not straight-line-freeway miles with autopilot vs. all miles driven by humans). If the car is in autonomous mode then the manufacturer must be liable for any at-fault accidents - they shouldn't have the wriggle-room to blame the driver for failing to intervene.
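    The comparison the comment calls for can be sketched in a few lines. This is a hypothetical illustration with made-up numbers, not real crash statistics: it just shows why rates must be compared within matched road types rather than pitting freeway-only autopilot miles against all human miles.

```python
# Hypothetical sketch of a per-comparable-mile safety comparison.
# All counts and mileages below are invented placeholders.

def incidents_per_million_miles(incidents, miles):
    """Normalize an incident count to a rate per one million miles driven."""
    return incidents / miles * 1_000_000

# (incidents, miles driven), broken down by road type so each
# comparison is like-for-like.
autonomous = {"freeway": (2, 4_000_000), "urban": (9, 1_000_000)}
human      = {"freeway": (5, 8_000_000), "urban": (30, 6_000_000)}

for road in autonomous:
    a = incidents_per_million_miles(*autonomous[road])
    h = incidents_per_million_miles(*human[road])
    print(f"{road}: autonomous {a:.2f} vs human {h:.2f} per million miles")
```

    With these invented figures the autonomous system looks better on freeways but worse in urban driving, which an aggregate all-miles rate would hide.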

    Or, preferably, the insurance system should be overhauled to avoid this nonsense of assigning fault with zero burden of proof. Your car gets wrecked? Your insurer pays. You or your passengers get injured? Your insurer pays. You hit a pedestrian? Your insurer pays. Accident happened because of your dangerous driving? That's for the police, the criminal justice system and the driver licensing authorities to decide, including whether you need to be taken off the road. Making insurance companies solely liable for the losses of their own clients should be a zero-sum game for consumers and the industry as a whole. The individual insurers won't like it, because they won't be able to gamble on dumping more liabilities on their competitors than their competitors dump on them, or make extra money by flogging details of "not at fault" claimants to ambulance-chasing lawyers and overpriced replacement-car firms. Boo hoo.
