
from the problems-with-his-connection dept.
Tesla Autopilot Crash Driver 'Was Playing Video Game'
BBC:
An Apple employee who died after his Tesla car hit a concrete barrier was playing a video game at the time of the crash, investigators believe.
The US National Transportation Safety Board (NTSB) said the car had been driving semi-autonomously using Tesla's Autopilot software.
Tesla instructs drivers to keep their hands on the wheel in Autopilot mode.
...
But critics say the "Autopilot" branding makes some drivers think the car is driving fully autonomously. The NTSB said the driver had been "over-reliant" on the software.
Tesla does instruct drivers to keep their hands on the wheel when using Autopilot, and an audible warning sounds if they fail to do so.
Does the Tesla branding of "autopilot" lure drivers into driving dangerously?
Tesla Crash Likely Caused by Video Game Distraction
The NTSB has published a review of a fatal crash involving a Tesla in March 2018 that includes a set of safety recommendations.
In its press release (2/25/2020) on the causes of the crash, the NTSB made the following observations:
The NTSB determined the Tesla "Autopilot" system's limitations, the driver's overreliance on the "Autopilot" and the driver's distraction – likely from a cell phone game application – caused the crash. The Tesla vehicle's ineffective monitoring of driver engagement was determined to have contributed to the crash. Systemic problems with the California Department of Transportation's repair of traffic safety hardware and the California Highway Patrol's failure to "report damage to a crash attenuator led to the Tesla striking a damaged and nonoperational crash attenuator" [sic], which the NTSB said contributed to the severity of the driver's injuries.
"This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today," said NTSB Chairman Robert Sumwalt. "There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you're not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don't own a self-driving car," said Sumwalt.
"In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures that, when combined, led to this tragic loss. The lessons learned from this investigation are as much about people as they are about the limitations of emerging technologies," said Sumwalt. "Crashes like this one, and thousands more that happen every year due to distraction, are why "Eliminate Distractions" remains on the NTSB's Most Wanted List of Transportation Safety Improvements," he said.
[...] the board also excoriated the National Highway Traffic Safety Administration for providing utterly ineffectual oversight when it comes to so-called "level 2" driver assists, as well as California's highway agency Caltrans, which failed to replace a damaged crash attenuator in front of the concrete gore, which in all likelihood would have saved Huang's life.
Previously:
NTSB Releases Preliminary Report on Tesla Autopilot Crash
Tesla Crash: Model X Was In Autopilot Mode, Firm Says
Related Stories
Tesla Model X driver dies in Mountain View crash
Submitted via IRC for Fnord666
The driver of a Tesla Model X has died following a highway crash in Mountain View, leaving a number of safety questions.
Source: https://www.engadget.com/2018/03/24/tesla-model-x-driver-dies-in-mountain-view-crash/
Tesla Crash: Model X Was In Autopilot Mode, Firm Says
In a post on its website, the electric-car maker said computer logs retrieved from the wrecked SUV show that Tesla's driver-assisting Autopilot technology was engaged and that the driver doesn't appear to have grabbed the steering wheel in the seconds before the crash.
The car's 38-year-old driver died after the vehicle hit a concrete lane divider on a Northern California freeway and caught fire. The accident happened March 23.
[...] In its Friday post, Tesla said the crashed Model X's computer logs show that the driver's hands weren't detected on the steering wheel for 6 seconds prior to the accident. It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."
The company cited various statistics in defending Autopilot in the post and said there's no doubt the technology makes vehicles safer than traditional cars.
"Over a year ago," the post said, "our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability."
"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."
Submitted via IRC for Runaway1956
Tesla fatal crash: 'autopilot' mode sped up car before driver killed, report finds
A Tesla driving in "autopilot" mode crashed in March when the vehicle sped up and steered into a concrete barrier, according to a new report on the fatal collision, raising fresh concerns about Elon Musk's technology.
The National Transportation Safety Board (NTSB) said that four seconds before the 23 March crash on a highway in Silicon Valley, which killed Walter Huang, 38, the car stopped following the path of a vehicle in front of it. Three seconds before the impact, it sped up from 62mph to 70.8mph, and the car did not brake or steer away, the NTSB said.
[...] The NTSB report [...] has once again raised serious safety questions about the limits and performance of the autopilot technology, which is meant to assist drivers and has faced growing scrutiny from experts and regulators. Mark Fong, an attorney for Huang's family, also said the report appeared to "contradict Tesla's characterization" of the collision.
The NTSB press release includes this link to the preliminary report, for anyone inclined to read the slightly longer version of events.
The Mountain View Fire Department applied about 200 gallons of water and foam to extinguish the post-crash fire. The battery reignited five days after the crash in an impound lot and was extinguished by the San Mateo Fire Department.
Tesla has yet another federal headache to contend with. On March 4, the National Highway Traffic Safety Administration's Office of Defects Investigation opened a preliminary investigation after two reports of Tesla Model Y steering wheels detaching in drivers' hands while driving.
NHTSA's ODI says that in both cases, the model year 2023 Model Ys each required repairs on the production line that involved removing their steering wheels. The wheels were refitted but were only held in place by friction—Tesla workers never replaced the retaining bolt that affixes the steering wheel to the steering column. In 2018, Ford had to recall more than 1.3 million vehicles after an incorrectly sized bolt resulted in a similar problem.
The ODI document states that "sudden separation occurred when the force exerted on the steering wheel overcame the resistance of the friction fit while the vehicles were in motion" and that both incidents occurred while the electric vehicles still had low mileage.
Related:
Tesla recalls all cars with FSD (full self driving) option (Elon tweet: "Definitely. The word 'recall' for an over-the-air software update is anachronistic and just flat wrong!")
Feds Open Criminal Investigation Into Tesla Autopilot Claims
NHTSA Investigation Into Tesla Autopilot Intensifies
Tesla's Radar-less Cars Investigated by NHTSA After Complaints Spike
Tesla Under Federal Investigation Over Video Games That Drivers Can Play
Tesla Must Tell NHTSA How Autopilot Sees Emergency Vehicles
NHTSA Opens Investigation into Tesla Autopilot after Crashes with Parked Emergency Vehicles
Tesla Recall is Due to Failing Flash Memory
Tesla Crash Likely Caused by Video Game Distraction
Autopilot Was Engaged In The Crash Of A Tesla Model S Into A Firetruck In LA, NTSB Says
Tesla to Update Battery Software after Recent Car Fires
Tesla Facing Criminal Probe
Former Tesla Employee's Lawyer Claims His Client Was Effectively "SWATted"
NHTSA Finishes Investigation, Declares Tesla Has No Fault in Deadly Crash
Tesla Says Autopilot System Not to Blame for Dutch Crash
(Score: 5, Funny) by mmh on Wednesday February 26 2020, @09:17PM (10 children)
Jack Thompson was right, video games do kill people!
(Score: 5, Funny) by Anonymous Coward on Wednesday February 26 2020, @09:18PM (6 children)
Candy CRUSHED
(Score: 1, Funny) by Ethanol-fueled on Wednesday February 26 2020, @09:26PM (5 children)
I LOLd.
Was thinking something like Angry Birds myself. Something stupid that a Tesla-buying yuppie might play.
(Score: 5, Funny) by takyon on Wednesday February 26 2020, @09:54PM
Faceplants vs. Zombies: Concrete Season Passed.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:26PM (2 children)
Maybe you should check out some Saudi Arabian dating apps, they seem more your level. If you could actually get laid you might chill the fuck out from your driving need to be a little prick.
(Score: 4, Touché) by Ethanol-fueled on Wednesday February 26 2020, @11:02PM (1 child)
Found the Tesla owner!
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @09:41AM
Found the witty remark!
(Score: 2) by Bot on Friday February 28 2020, @03:43PM
Plot twist: the videogame was a driving sim.
Account abandoned.
(Score: 2) by DannyB on Wednesday February 26 2020, @09:53PM
Jack Thompson was confused.
Video games don't kill people. Auto Pilots kill people.
It's just that Jack Thompson was too preoccupied with inflating the auto pilot.
Fact: We get heavier as we age due to more information in our heads. When no more will fit it accumulates as fat.
(Score: 2) by Bot on Friday February 28 2020, @03:46PM (1 child)
I can't believe nobody died while catching pokemon go.
I thought the slogan of Pokemon Go was: gotta catch 'em aaaaaaaaaaaaaaaaaaaaaaaaghhhh
Account abandoned.
(Score: 2) by Booga1 on Friday February 28 2020, @10:14PM
Oh, there have been a few accidents and deaths with Pokemon Go.
Driver playing Pokemon Go runs over two pedestrians, killing one: https://www.gamespot.com/articles/woman-killed-by-pokemon-go-playing-driver-police-s/1100-6442956/ [gamespot.com]
Driver playing Pokemon Go runs over bicyclist (dead): https://www.gamespot.com/articles/another-death-linked-to-pokemon-go-in-japan/1100-6443046/ [gamespot.com]
Two men fall over a cliff (survived): https://www.gamespot.com/articles/captivated-by-pokemon-go-two-men-fall-down-a-90-fo/1100-6441825/ [gamespot.com]
Two players (17 and 18 years old) wander into the wrong neighborhood and get gunned down (one dead): https://www.news.com.au/technology/home-entertainment/gaming/apps/pokemon-gos-first-death-as-warning-issued-about-landmines/news-story/e29caec2a2170721b657bae6a671b118 [news.com.au]
New York driver crashes into tree while playing the game: https://www.gamespot.com/articles/car-crash-attributed-in-part-to-pokemon-go/1100-6441790/ [gamespot.com]
15-year-old gets hit crossing highway during rush hour: https://www.gamespot.com/articles/15-year-old-hit-by-car-after-playing-pokemon-go/1100-6441810/ [gamespot.com]
I'm sure there are more.
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:21PM (20 children)
Recognize that the percentage of drivers who will totally ignore the warnings about Autopilot, and therefore operate the system unsafely, is large enough to negate any possible safety benefit of having the computer in control. Thus such systems should be disabled until it can be proven that they will never get the driver into an accident that a human would easily avoid.
(Score: 5, Insightful) by PartTimeZombie on Wednesday February 26 2020, @09:28PM (3 children)
I see people in non-Tesla cars every day texting while they drive, among other stupid things, so yes.
It is hard to legislate for stupidity.
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:29PM (2 children)
True, but one can do things to avoid facilitating it, like not giving people a reason to stop paying attention to the road, as Autopilot does.
(Score: 2) by MostCynical on Thursday February 27 2020, @01:58AM (1 child)
Or radios, or makeup, or children, or gps, or phones...
Humans are stupid.
We kill and maim millions every year with motor vehicles.
Self-driving cars are already better drivers than humans, but most humans think they are far better than average drivers.
So: stupid and delusional.
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 1, Insightful) by Anonymous Coward on Thursday February 27 2020, @03:36AM
Citation needed, buddy.
We don't have enough data to make that determination yet.
(Score: 1) by Ethanol-fueled on Wednesday February 26 2020, @09:29PM (6 children)
Eh, in the meantime the Tesla badge on a car is a marker to stay the hell away and follow from a distance in another lane. Though speaking of, I have even worse news for you all: BMW is also working on their own self-driving car. Soon every day on the highways will be Death Race 2000. [wikipedia.org]
(Score: 1, Insightful) by Anonymous Coward on Wednesday February 26 2020, @09:35PM
This -- after the Tesla hit the (previously collapsed) barrier, two more cars followed it in and were somewhat damaged (limited injuries).
(Score: 1, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:38PM (3 children)
Eh, in the meantime the Tesla badge on a car is a marker to stay the hell away and follow from a distance in another lane.
Yeah, the whole highway to yourself. I play a similar trick when I talk to myself on the bus so nobody will sit next to me.
(Score: 2) by barbara hudson on Wednesday February 26 2020, @09:46PM (2 children)
Doesn't work any more - everyone is "talking to themselves" when using bluetooth on their phones. You need to ask any interlopers if the voices in your head are too loud for them. Or ask them if they want to talk about the WatchTower.
SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
(Score: 4, Funny) by sjames on Wednesday February 26 2020, @09:59PM (1 child)
Just give them a goofy grin and declare "I'm wearing new socks!".
(Score: 2) by barbara hudson on Wednesday February 26 2020, @10:37PM
... for extra effectiveness, say it while wearing sandals and no socks. Then say "you'll never guess WHERE I'm wearing them."
Might be fun to make videos of people doing stuff to encourage people to get up and move.
SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
(Score: 1, Informative) by Anonymous Coward on Wednesday February 26 2020, @10:28PM
Do the theoretical practice now. [sjgames.com]
(Score: 5, Informative) by Booga1 on Wednesday February 26 2020, @09:37PM (6 children)
Well, you can rule that out in this situation. From a previous report on this (linked):
That particular spot has had multiple accidents over the years and Caltrans sometimes takes weeks to repair the barrier. If I remember correctly, it's a left hand exit. That's something that's pretty rare in the US and it throws people off all the time.
Locally, I've seen people get angry at solo drivers moving into the carpool lane to take one of the ones around here. It is not a carpool restricted exit, and the lines become dashes to allow anyone to use the exit. It doesn't help. The carpool drivers get road rage angry to the point they'll tailgate the car in front by inches just to deny the solo driver a chance to take the exit.
I'm not a fan of autonomous driving, but at least the systems are only stupid instead of malicious.
(Score: 2, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:50PM
Another report (can't find it just now) notes that the same Tesla had previously veered left at the same location. The driver had mentioned to friends/family that he had to intervene to keep the car in the lane at that left exit... So the driver already knew that Autopilot had trouble with that road/lane configuration. And yet, wasn't paying attention when he got there.
20-20 hindsight -- the Tesla could have noted that driver intervention was needed at that location and turned off Autopilot (with suitable warnings) some distance in advance, forcing manual control. But no, the Autopilot did the same thing again.
In a sense it's a little surprising that more Teslas didn't follow this one into the same "trap" before the next software update.
(Score: 4, Interesting) by Snotnose on Thursday February 27 2020, @01:11AM (4 children)
Quite frankly I don't give a shit. The dead asshole was in the driver's seat playing a video game, instead of paying attention. Don't care why the road was damaged there. Don't care that it was a known issue for Tesla's auto-drive. Don't care that Caltrans took their own sweet time fixing something that got broken in an earlier accident.
Had this asshole been watching the road he would be alive today to tweet about Tesla's bad auto pilot. Instead, it was more important to him to farm corn in farmville, or whatever.
Fuck him. Better he kills himself running into concrete than killing a family tooling along in their SUV, with the driver actually, ya'know, driving.
Is anyone surprised ChatGPT got replaced by an A.I.?
(Score: 2) by Booga1 on Thursday February 27 2020, @01:34AM (3 children)
The crash may very well have taken others out had things turned out a little differently. This was a multi-car accident.
From the article:
(Score: 4, Informative) by Booga1 on Thursday February 27 2020, @01:38AM (2 children)
The photo from the article [ntsb.gov] really puts into perspective how close this came to being a multiple-fatality accident.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @04:10PM (1 child)
thanks for link.
i was just thinking why that concrete block is square and not sloped.
my guess is that a sloped concrete block (front side to enemy) would FLIP a car beautifully.
a flipped car doesn't travel very far and will stop from friction. furthermore, once flipped there's still enough "crumble zone" to protect passenger(s).
anyways, if you want to make the divider even more efficient at killing people, i recommend forming a big honking iron into a vertical cleave ^_^
(Score: 2) by Booga1 on Thursday February 27 2020, @04:28PM
Can't pick between a solid block and an angled ramp? I see you're willing to split the difference!
(Score: 2) by sjames on Wednesday February 26 2020, @09:57PM
Or recognize that sometimes with or without autopilot, people will do dumb things against all warnings and advice. Do we disable all cars or do we give appropriate warnings and hope they don't do something stupid? (such as playing a video game while driving in spite of audible and visual warnings to pay attention to the road)
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:23PM
You mean like this guy [cnet.com]?
(Score: 2) by Gaaark on Wednesday February 26 2020, @09:27PM (2 children)
Take them by the hand, and they'll rip it away and run in front of the bus.
Stupid is as....
--- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:33PM (1 child)
Corollary?
Apple hires stupid people for game development? I think there is more to it than just that.
For example, it's rare to really internalize risk properly. The highway system may go a really long way (a million miles?) between accidents. But if a user of Autopilot (or another assist system) sees that it's doing OK after a few weeks, or a few months, they start to assume it's as safe as the highway system as a whole.
Another possibility is the built-in addictive ability of gaming, it's so good that even a game programmer is sucked in and needs that constant fix.
(Score: 2) by barbara hudson on Wednesday February 26 2020, @09:49PM
Apple makes games?
Who knew?
Who cares?
SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
(Score: 2, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:27PM
This news page has a photo of the barrier that was previously hit (that driver was more-or-less OK), but the barrier wasn't repaired/reset in time for the Tesla:
https://ktla.com/news/local-news/ntsb-says-california-officials-failed-to-fix-highway-barrier-before-deadly-tesla-crash-in-bay-area/ [ktla.com]
Article was posted last fall.
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:28PM (2 children)
I thought this was going to be about a Tesla crashing and distracting someone from their videogame, so they lost on the last level or something.
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:38PM
Oh, he lost all right. Big time, the whole enchilada.
Not sure if the phone logs revealed what game level he was on before the accident. Depending on your religious beliefs, he may be on to the next level now.
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:48PM
My read was “running a video game on Tesla’s computer, and crashed the car.”
Would make a great gaming platform. But only in PARK
(Score: 1, Insightful) by Anonymous Coward on Wednesday February 26 2020, @09:41PM (15 children)
The driver expects the car to drive itself. The car can't actually drive itself safely, and requires babysitting.
It's a terrible feature as it stands. You might as well just drive the car yourself if you still have to look over and deal with whatever the hell the car is doing.
(Score: 3, Insightful) by sjames on Wednesday February 26 2020, @10:09PM
The car constantly bongs and warns him that he needs to pay attention and he ignores it...
(Score: 5, Interesting) by KilroySmith on Wednesday February 26 2020, @10:18PM (2 children)
Actually, it's an excellent feature if used intelligently. Sadly, Mr. Huang, despite being an engineer, was not using the system intelligently. IMHO, continuing to use AP as he did despite having multiple instances of it working badly indicates to me that he had a blatant disregard for his own safety.
I have a Model 3, and use AP extensively. I've also noted that it's close to being able to drive autonomously on freeways, but it's not there yet. It occasionally does rude things, occasionally does stupid things, although I haven't noted it doing dangerous things yet - but I read the news and know of the kinds of dangerous things it can do. As a result, I drive with my hand on the wheel and my eyes outside the car. It makes driving less stressful once you realize that it's a driver assist and not a driver replacement; I'm operating at a more executive level of watching what's going on around me rather than having to pay attention to whether I'm centered in the lane, driving an appropriate speed, or following at a safe distance - the car is excellent at those tasks.
Maybe this situation shows why we can't have nice things, why nothing should be sold unless it's idiot proof. Unfortunately, the options are currently less savory - the September 11, 2001 terror attacks in the USA that forever changed our society killed fewer people than vehicles do in a normal month in the USA. Autonomous driving systems, even flawed ones, can drastically reduce those deaths and injuries.
(Score: 2, Insightful) by Anonymous Coward on Thursday February 27 2020, @03:32AM (1 child)
Jesus, driving isn't that hard, especially if you have an automatic transmission, which everybody in the US does (all except 5 people).
You can drive while thinking of other things, even. You are telling me you need an "assist" in the form of the criminally misnamed Auto Pilot to help you drive? You are going senile if that is the case. Look, Tesla's Auto Pilot is the worst possible implementation: you have to pay attention and have hands ready as if you were driving, while not driving, in case on a second's notice you HAVE to drive. That's more cognitive load and context switching than simply driving! At least if you plan on not crashing...
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @04:27AM
> (all except 5 people)
Hey, you didn't count me, there are 6 manual transmission users left. So there!
This https://www.chicagotribune.com/autos/sc-auto-cover-manual-transmissions-20180710-story.html [chicagotribune.com] claims the take rate in USA was 2% in 2018.
(Score: 5, Interesting) by darkfeline on Wednesday February 26 2020, @11:27PM (10 children)
It's quite interesting where this misconception about autopilot could have come from.
Do passengers expect that airplane autopilot means that their human pilot gets to take a nap or play hanky panky? Why then do they expect to be able to do the same with car autopilot?
Maybe it's because of the marketing, but I don't know anything about that since I've never seen a Tesla ad.
Join the SDF Public Access UNIX System today!
(Score: 0, Insightful) by Anonymous Coward on Thursday February 27 2020, @12:50AM (5 children)
Apparently, your English isn't very good. Autopilot literally means selfpilot. The fact that the aviation industry has a different definition doesn't change the meaning of the components of the word. That's what normal people expect out of an autopilot, the ability to drive a segment without intervention.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @07:04AM (2 children)
So does that mean automobiles will normally just start moving by themselves? ;)
Without intervention does not mean you can safely play a video game or go to sleep or leave the pilot seat while it's doing so.
See also: https://en.wikipedia.org/wiki/Autopilot#First_autopilots [wikipedia.org]
(Score: 2) by maxwell demon on Thursday February 27 2020, @07:08PM
Automobiles move themselves (before, you had to have a horse doing the moving). There is no "starting" in "automobile" (sorry, I'm too lazy to find out what "starting" is in Latin).
The Tao of math: The numbers you can count are not the real numbers.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @08:34PM
As the other poster said, automobiles move themselves once started and put into gear. The steering and specific speed are controlled by the driver. Auto means self, mobile means move, so they are self moving.
If you have to count on overriding the common use of a word in reference to a mass market product, bad things are likely to happen. It's one of the reasons for weird invented terms for features.
(Score: 2) by darkfeline on Thursday February 27 2020, @09:52PM
Autopilot does have the ability to drive a segment without intervention, right into the concrete barrier.
Human pilots make mistakes too, I don't see why selfpilots are exempt from that.
Join the SDF Public Access UNIX System today!
(Score: 2) by sjames on Friday February 28 2020, @08:07PM
And hot water heater means a device that further heats already hot water but people routinely supply it with cold water because they know what is actually meant.
(Score: 2) by maxwell demon on Thursday February 27 2020, @01:43PM (2 children)
Yes. Well, not exactly, they expect the pilots to still handle things like communications with towers, but they definitely wouldn't expect that the pilot has to care much about what the plane does.
The Tao of math: The numbers you can count are not the real numbers.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @05:21PM (1 child)
You have the advantage in the air that everyone else is following a track as well, and there aren't any trees, gas stations, or pedestrians up at 40,000 feet. You tend to just hit air, which is kind of soft.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @08:36PM
Then, don't call it autopilot until it can self pilot reliably. Problem solved.
(Score: 2) by sjames on Friday February 28 2020, @08:22PM
The Tesla autopilot actually does more than a jumbo jet's autopilot does. The aircraft version doesn't include collision avoidance; that's the pilot's job.
(Score: 1, Funny) by Anonymous Coward on Wednesday February 26 2020, @09:43PM (2 children)
And Tesla's response... He wasn't holding it right
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:22PM (1 child)
Has it gotten to the point where "holding it wrong" is more important for the phone than the dick?
(Score: 0) by Anonymous Coward on Wednesday February 26 2020, @11:34PM
I would guess iPhone users stroke both the same way.
(Score: 1) by bobmorning on Wednesday February 26 2020, @10:53PM (2 children)
The gene pool just got a slight improvement.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @02:35AM
That is some dumb shit, weren't you defending Mike the Rocket Man?
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @03:41AM
But the automotive technology gene pool didn't get any better. This shit is faulty and does not deliver as promised, but people will keep buying it. A crash will not stop it from reproducing.
(Score: 0) by Anonymous Coward on Thursday February 27 2020, @01:27AM
is that it didn't make a normal crash sound but that pacman sound when you lose a life.
(Score: 2) by maxwell demon on Thursday February 27 2020, @01:46PM
An autopilot should never get distracted by a video game! :-)
The Tao of math: The numbers you can count are not the real numbers.
(Score: 2) by Phoenix666 on Thursday February 27 2020, @09:52PM (1 child)
Wait, was he playing Frogger? Because that would be choice.
Washington DC delenda est.
(Score: 2) by sjames on Friday February 28 2020, @08:25PM
I have often thought that the audible crosswalk signals should use the Frogger SPLAT sound to signal "don't walk."