US officials to investigate Tesla Autopilot following 11 crashes
WASHINGTON - US safety officials have opened a preliminary investigation into Tesla's Autopilot system after identifying 11 crashes involving the driver assistance system, officials said on Monday.
The incidents, dating back to 2018, included one fatal crash and seven that resulted in injuries to 17 people, according to the National Highway Traffic Safety Administration.
The agency "is committed to ensuring the highest standards of safety on the nation's roadways," a spokesperson said, and in order to "better understand the causes of certain Tesla crashes, NHTSA is opening a preliminary evaluation into Tesla Autopilot systems."
Tesla founder Elon Musk has defended the Autopilot system and the electric carmaker warns that it requires "active driver supervision" behind the wheel.
But critics, including some in Congress, say the system can be easily fooled and that its name gives drivers a false sense of confidence. They have called for NHTSA to take action.
Tesla did not respond to an AFP request for comment.
Testers with the magazine Consumer Reports demonstrated in a video that Autopilot could be fooled into driving with no one behind the wheel, a ploy also shown in widely-seen videos on TikTok and other social media platforms.
The crashes cited by NHTSA involved incidents in which "various Tesla models crashed" in instances where first responders were involved, including "some that crashed directly into the vehicles of first responders," the NHTSA spokesperson said.
Three of the crashes took place in California, with others in Florida, Texas, Massachusetts and several other states. The probe covers Models Y, X, S and 3, the agency said.
"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," the spokesperson said.
"Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly."
News of the probe sent Tesla shares sharply lower on Monday.
Investigations such as the one announced Monday sometimes lead to recalls. In June, Tesla recalled more than 285 000 cars in China due to problems with the cruise control system that authorities there said could lead to collisions.
However, analysts said such a recall could involve a software update rather than a hardware change requiring costly equipment upgrades.
Pushing the boundaries
Musk has a history of skirmishing with regulators, but the controversies have had little effect on Tesla's ascendance over the last year and a half as the company has hit key production targets.
His achievement in building Tesla from a fledgling start-up into a pacesetter in the electric car market stands out, as other electric startups such as Lordstown Motors and Nikola have stumbled.
At the same time, Musk has sparked blowback from critics as he pushed or flouted the rules on everything from his use of social media to discuss Tesla's operations to his response to Covid-19 health protocols required by local authorities near the California plant.
The Centre for Auto Safety, a non-profit group, has pressed US officials since 2018 to bar the name "Autopilot," viewing the moniker as deceptive.
Jason Levine, the centre's executive director, welcomed news of the NHTSA probe but said in an email to AFP that it should go "far beyond" crashes involving first responder vehicles "because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged."
"Whether Autopilot needs to be disabled, or be required to use driver monitoring systems to prevent these crashes, is a question for NHTSA," Levine said. "But there's no question that something needs to be done quickly to prevent more injuries and deaths."
Morningstar analyst Seth Goldstein said the most likely outcome of the probe would be a requirement for a software update and additional warnings about the limits of Autopilot.
"We think the incidents highlight the need for Tesla to continue to improve its autonomous software before the company is likely to see a large revenue increase from its subscription-based full self-driving software," Goldstein said in a note.
In April, Democratic senators Richard Blumenthal of Connecticut and Ed Markey of Massachusetts urged NHTSA to probe a fatal crash in Texas involving a Tesla after law enforcement said there was no driver behind the wheel.
Tesla has said it does not believe the April crash involved Autopilot.