The first fatality involving Tesla's Autopilot feature led to questions over the safety of the semi-autonomous system last year, but a report published by the National Transportation Safety Board (NTSB) concludes that Elon Musk's company was not at fault. While the cause of the crash has still not been determined, the 538-page report states that driver Joshua Brown had his hands off the wheel of the Tesla Model S "for the vast majority of the trip." This was despite receiving seven visual warnings, six of which also sounded a chime, to maintain control during the 37-minute journey.
Green Car Reports states:
The truck driver involved in the crash also claimed Brown was watching a movie at the time of impact—an aftermarket DVD player was found among the wreckage.
On the other hand, Ars Technica reports otherwise:
In the latest regulatory documents on the incident, the National Transportation Safety Board disputed some accounts that Brown was watching a Harry Potter movie during the crash last year. The board said it found several electronic devices, but there was no evidence that they were being operated during the accident.
Ars elaborates on the amount of time that the driver had his hands on the wheel:
Tesla's Autopilot mode allows a vehicle to maintain the speed of traffic, and an auto-steer function is designed to help keep the Tesla inside its lane. The board said the Tesla alerted the driver seven times with a visual warning of "Hands Required Not Detected." According to "system performance data" from Tesla, the motorist, a former Navy SEAL, had his hands on the wheel for only 25 seconds of the 37-minute portion of the trip during which they should have been on the steering wheel.
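The escalation pattern the report describes — repeated visual warnings, some accompanied by a chime — can be sketched as a simple monitoring loop. This is a hypothetical illustration only, not Tesla's actual implementation: the thresholds, sampling model, and function name are invented for the example; only the "Hands Required Not Detected" message text comes from the report.

```python
# Hypothetical sketch of a hands-on-wheel warning escalation loop.
# NOT Tesla's implementation: thresholds and names are invented.

def monitor(hands_on_samples, visual_after=15, chime_after=25):
    """Given per-second boolean samples of hands-on-wheel detection,
    return the warnings such a system might issue as (time, message)."""
    warnings = []
    hands_off = 0  # consecutive seconds without detected steering input
    for t, hands_on in enumerate(hands_on_samples):
        if hands_on:
            hands_off = 0  # driver input resets the escalation
            continue
        hands_off += 1
        if hands_off == visual_after:
            warnings.append((t, "visual: Hands Required Not Detected"))
        elif hands_off == chime_after:
            warnings.append((t, "visual + chime"))
            hands_off = 0  # restart escalation after the audible alert
    return warnings
```

With one minute of continuous hands-off driving, `monitor([False] * 60)` produces alternating visual and chimed warnings, which matches the report's pattern of a visual alert followed by an audible one when the driver does not respond.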
(Score: 2, Informative) by fustakrakich on Wednesday June 21 2017, @11:31AM (6 children)
Well, it wasn't exactly 'faultless'. The car could have slowed down and/or stopped when the driver didn't heed the warnings.
Politics and criminals are the same thing..
(Score: 2) by wonkey_monkey on Wednesday June 21 2017, @03:30PM (3 children)
That's far more dangerous than simply continuing to drive. Not, perhaps, as dangerous as driving into a truck, but we don't know exactly why that happened yet.
systemd is Roko's Basilisk
(Score: 4, Insightful) by vux984 on Wednesday June 21 2017, @03:50PM (1 child)
Nope. Suppose the driver has fallen asleep.
That's not to say the car should pull over on a bridge or something, but common sense dictates it pull over as soon as it is somewhere safe. If it can't figure that out, it's not ready to drive.
(Score: 2) by wonkey_monkey on Thursday June 22 2017, @09:04PM
So what if he has? That's not a situation the technology, in its current state, is designed to cope with (although it could potentially still cope with it better, by continuing to drive within lane markings and keeping safe distances from other vehicles, than a dumb car can).
No, it's just not ready to have its driver fall asleep at the wheel, and/or ignore all the warnings telling them they're not using the system as safety and common sense dictate.
systemd is Roko's Basilisk
(Score: 1, Insightful) by Anonymous Coward on Wednesday June 21 2017, @07:25PM
Nonsense. If a car's cruise control fails (I used to have a car with a wonky cruise control myself), the vehicle comes to a stop if the driver's foot isn't on the pedal.
Everybody is taking this marketing babble "autopilot" and assuming that means "autonomous." Why the ever living fuck can't Tesla just give a mea culpa and admit that their marketing gave the wrong impression or what the fuck ever however weasel lawyers would put it and STOP FUCKING CALLING IT AUTOPILOT. IT'S ENHANCED FUCKING CRUISE FUCKING CONTROL WITH FUCKING LANE FUCKING ASSIST! NOTHING MORE! Fuck!
(Score: 2) by FatPhil on Thursday June 22 2017, @07:46AM (1 child)
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 1) by fustakrakich on Thursday June 22 2017, @03:22PM
Where did I say you did?
Politics and criminals are the same thing..