El Reg reports
[January 23] a Tesla Model S slammed into a stationary firetruck at around 65mph on Interstate 405 in Culver City, California. The car ended up wedged under the fire engine, although the driver was able to walk away from the crash uninjured and declined an offer of medical treatment.
The motorist claimed the Model S was driving with Autopilot enabled when it crammed itself under the truck. Autopilot is Tesla's super-cruise-control system. It's not a fully autonomous driving system.
[...] The fire truck was parked in the carpool lane of the road with its lights flashing. None of the fire crew were hurt, although Powell noted that if his team had been in their usual position at the back of the truck then there "probably would not have been a very good outcome."
Tesla will no doubt be going over the car's computer logs to determine exactly what happened, something the California Highway Patrol will also be interested in. If this was a case of the driver switching on Autopilot and forgetting their responsibility to watch the road ahead, it wouldn't be the first time.
In 2016, a driver was killed after both he and the Tesla systems missed a lorry pulling across the highway. A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the car six times to keep his hands on the wheel.
Tesla has since beefed up the alerts the car will give a driver if it feels they aren't paying full attention to the road. The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.
Previous: Tesla's Semiautonomous System Contributed to Fatal Crash
(Score: -1, Troll) by Anonymous Coward on Thursday January 25 2018, @03:23AM (3 children)
Seems pretty obvious to me, no one trained the "AI" to deal with an emergency vehicle stopped in the HOV (carpool/hybrid/EV) lane. Just another example of Tesla using its customers for beta testing, nothing to see here, move along please...
(Score: 5, Informative) by Anonymous Coward on Thursday January 25 2018, @01:00PM (2 children)
Update from Wired -- https://www.wired.com/story/tesla-autopilot-why-crash-radar/ [wired.com]
Looks like parent had a point, even if made in a crude way.
The Wired story has more explanation of the design tradeoffs involved, and of why lidar is needed: unlike radar, it can distinguish between road furniture (signs, etc.) and an actual obstacle.
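A toy sketch of the tradeoff the Wired piece describes (this is illustrative only, not Tesla's or any vendor's actual code): radar reports range and relative velocity, so a parked truck and an overhead sign both look like stationary returns, and many radar-based cruise systems simply discard near-zero ground-speed returns to avoid phantom braking.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detected object
    rel_speed_mps: float  # speed relative to our car (negative = closing)
    label: str            # ground truth, for illustration only

EGO_SPEED_MPS = 29.0  # ~65 mph, as in the crash report

def ground_speed(ret: RadarReturn) -> float:
    """Object's speed over the ground: ego speed plus relative speed."""
    return EGO_SPEED_MPS + ret.rel_speed_mps

def is_tracked(ret: RadarReturn, stationary_cutoff_mps: float = 2.0) -> bool:
    """Keep only returns that are moving over the ground.

    Dropping stationary returns suppresses false brakes for signs and
    bridges, but it also drops a fire truck parked in the lane. Lidar
    sidesteps the dilemma by resolving the object's shape and position.
    """
    return abs(ground_speed(ret)) > stationary_cutoff_mps

returns = [
    RadarReturn(80.0, -4.0, "slower car ahead"),    # ground speed 25 m/s
    RadarReturn(60.0, -29.0, "parked fire truck"),  # ground speed 0: dropped!
    RadarReturn(70.0, -29.0, "overhead sign"),      # ground speed 0: dropped
]

for ret in returns:
    print(f"{ret.label}: tracked={is_tracked(ret)}")
```

With this filter, the moving car is tracked while both the fire truck and the sign are discarded as clutter, which is exactly the failure mode the parent comment points at.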
(Score: 3, Informative) by gawdonblue on Thursday January 25 2018, @11:25PM
I was a passenger in one of these adaptive cruise control cars yesterday when both the car in front and the car I was in made a turn, and they very nearly collided. Basically, the car ahead slowed to take the turn, so the car I was in automatically slowed to keep its distance; but when that car disappeared around the corner, the car I was in accelerated into the "clear" space, and our driver had to brake very heavily to make the turn and then steer to avoid the slower car in front. It was a little bit scary.
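The surge described above falls out of how generic adaptive cruise control picks its target speed. A minimal sketch (assumed, generic logic; no particular vendor's implementation, and `target_speed_mps` is a name invented here): when no lead vehicle is tracked, the controller resumes the driver's set speed, which is exactly what happens the moment the lead car rounds the corner and drops off the sensor.

```python
from typing import Optional

def target_speed_mps(set_speed: float,
                     lead_speed: Optional[float],
                     lead_gap_m: Optional[float],
                     min_gap_m: float = 30.0) -> float:
    """Pick a target speed: follow a tracked lead car, else resume set speed."""
    if lead_speed is None or lead_gap_m is None:
        return set_speed                   # no lead tracked: resume set speed
    if lead_gap_m < min_gap_m:
        return min(set_speed, lead_speed)  # too close: match the lead car
    return set_speed

# Lead car slows to 10 m/s for the turn, 20 m ahead: we slow with it.
print(target_speed_mps(27.0, lead_speed=10.0, lead_gap_m=20.0))   # 10.0
# Lead car rounds the corner and vanishes from the sensor: back to 27 m/s.
print(target_speed_mps(27.0, lead_speed=None, lead_gap_m=None))   # 27.0
```

The controller has no notion that the "clear" space is a blind corner; it only knows the lead target disappeared, so it accelerates, leaving the human to brake.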