SAN FRANCISCO — Tesla began letting owners request its “Full Self-Driving” software early Saturday, opening up for wide release its most advanced driver-assistance suite and signaling thousands of drivers will soon be on the road with the unregulated and largely untested features.
It’s the first time the company has let typical owners upgrade to the software it terms self-driving, although the name itself is an exaggeration by industry and regulatory standards. Tesla chief executive Elon Musk had said owners would be able to request the upgraded suite of advanced driver-assistance features this weekend, which Tesla says is a beta, although they wouldn’t receive the capabilities right away.
Owners will have to agree to let Tesla monitor their driving behavior through the company insurance calculator. Tesla issued a detailed guide specifying the criteria under which they would be graded. If their driving is deemed to be “good” over a seven-day period, Musk said on Twitter, “beta access will be granted.”
It’s the latest twist in a saga that has regulators, safety advocates and families of Tesla crash victims up in arms because of the potential for chaos as the technology is unleashed on real-world roads. Until now, roughly 2,000 beta testers have had access to the technology.
This weekend’s release would make it available to those who have purchased the now-$10,000 software upgrade, as well as those who have purchased a subscription from Tesla for about $100 to $200 per month, provided they can first pass Tesla’s safety monitoring.
As recently as July, Musk said the technology was a “debatable” proposition, arguing that “we need to make Full Self-Driving work in order for it to be a compelling value proposition.”
And already, investigators are looking at its predecessor, dubbed Autopilot, which navigates vehicles from highway on-ramp to off-ramp and can park and summon cars while a driver monitors the software. The National Highway Traffic Safety Administration opened an investigation last month into around a dozen crashes involving parked emergency vehicles while Autopilot was engaged.
“Full Self-Driving” expands Autopilot’s capabilities to city streets and offers the ability to navigate the vehicle turn-by-turn, from point A to point B.
Tesla and NHTSA did not immediately respond to requests for comment. Tesla has repeatedly argued that Autopilot is safer than cars in manual driving when the modes are compared using Tesla data and information from NHTSA.
Musk has said “Autopilot is unequivocally safer” than typical cars. The data is not directly comparable, however, because Autopilot is supposed to be activated only on certain types of roads in conditions where it can function properly.
Tesla’s move to rapidly roll out the features to large numbers of users is drawing criticism from regulators and industry peers who say it is taking a hasty approach to an issue that requires careful study and an emphasis on safety.
Despite its name, the new software does not qualify as “self-driving” under criteria set by the auto industry or safety regulators, and drivers should always pay attention while it is activated.
“I do think that their product is misleading and overall leads to further misuse and abuse,” said National Transportation Safety Board chair Jennifer Homendy, before turning to Musk himself. “I’d just ask him to prioritize safety as much as he prioritizes innovation and new technologies … safety is just as important, if not more important, than the development of the technology itself.”
As for the evaluation period for drivers who want to sign up, Tesla posted its “safety score” system on its website shortly before the button’s release. It said drivers would be scored on a 0 to 100 scale, with most receiving 80 or above. Drivers will be assessed on five factors, it said: forward collision warnings per 1,000 miles, instances of hard braking, aggressive turning, unsafe following and forced disengagements of the Autopilot system. Tesla would then use a formula to calculate their score.
“These are combined to estimate the likelihood that your driving could result in a future collision,” Tesla wrote. It was not immediately clear what score would qualify as “good” — as characterized by Musk — in order to receive Full Self-Driving.
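As a rough illustration of how a composite driving score of this kind might be computed, the sketch below combines the five factors Tesla names into a single 0-to-100 number. The weights and scoring curve are invented for illustration; Tesla has published its own formula, and nothing here reproduces it.

```python
# Hypothetical illustration of a 0-100 driving "safety score" built
# from the five factors Tesla says it tracks. The weights below are
# made up for illustration and are NOT Tesla's actual formula.

def safety_score(fcw_per_1k_miles: float,
                 hard_braking_pct: float,
                 aggressive_turning_pct: float,
                 unsafe_following_pct: float,
                 forced_disengagements: int) -> float:
    """Combine five risk factors into a 0-100 score (higher is safer)."""
    # Each factor contributes a penalty proportional to an assumed weight.
    penalty = (
        2.0 * fcw_per_1k_miles +
        1.5 * hard_braking_pct +
        1.5 * aggressive_turning_pct +
        1.0 * unsafe_following_pct +
        5.0 * forced_disengagements
    )
    # Clamp to the 0-100 range.
    return max(0.0, min(100.0, 100.0 - penalty))

print(safety_score(1.0, 2.0, 1.0, 3.0, 0))  # 90.5
```

A flawless week of driving (all factors zero) would score 100 under this toy model, while frequent forward-collision warnings or Autopilot disengagements would quickly drag the number down.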
Musk had earlier said drivers who make frequent use of the company’s Autopilot software will be rated favorably. Owners will be able to track their progress in real time, he said, and will be guided on how they can satisfy the requirements.
Late last month, industry group Chamber of Progress took aim at Tesla’s marketing of the technology.
Tesla’s cars “aren’t actually fully self-driving,” wrote the group, which is supported by Apple, Alphabet-owned Waymo and General Motors-backed Cruise. “The underlying issue here is that in case after case, Tesla’s drivers take their eyes off the road because they believe they are in a self-driving car. They aren’t.”
Homendy, the NTSB chair, said Tesla has not shown an active interest in improving the safety of its products. She said the board has made recommendations stemming from fatal crashes in Williston and Delray Beach, Fla., as well as Mountain View, Calif., but they have gone unanswered.
“Tesla has not responded to any of our requests,” she said. “From our standpoint they’ve ignored us — they haven’t responded to us.”
“And if those are not addressed and you’re making additional upgrades, that’s a problem,” she added.
Following an investigation into a 2018 crash that killed a driver when his vehicle slammed into a highway barrier, the safety board called on NHTSA to evaluate whether Tesla’s systems posed an unreasonable safety risk.
Homendy said NHTSA needs to take a more active role in the matter. The agency recently began requiring reporting on all crashes involving driver-assistance systems.
“It is incumbent on a federal regulator to take action and ensure public safety,” Homendy said. “I am happy that they’ve asked for crash information from all manufacturers and they’re taking an initial step with Tesla on asking for crash information on emergency vehicles. But they need to do more.”
On Twitter, a steady stream of videos from early beta tests has depicted the still-nascent “Full Self-Driving” system’s confusion at new obstacles. The system has been shown struggling with roundabouts and unprotected left turns, abruptly veering toward pedestrians, and crossing a double-yellow line into oncoming traffic.
In the latter case, the user wrote: “I want the best for Tesla, but going wide release is not the move, not right now at least.”
Others said they have suffered personally from Tesla’s rapid deployment of its software, and urged the company to reconsider.
Bernadette Saint Jean’s husband, Jean Louis, was killed in July on the Long Island Expressway when a Tesla believed to be using automated features struck him on the side of the road, a crash being investigated by NHTSA.
“Tesla should not be expanding its Autopilot or Traffic-Aware Cruise Control Systems until they can tell me why my husband and all of those First Responders had to die and be injured,” said Saint Jean, of Queens, in a statement through her attorney, Joshua Brian Irwin.