Damon Lavrinc is a freelance writer and industrial design student focused on the future of transportation. A former driving instructor and communications professional, Damon is the co-founder of the Autonocast and led transportation technology coverage at WIRED, Jalopnik, and other outlets.
The Tesla Recall Is a Win for Tesla
And a loss for safety advocates.
More than 2 million Tesla vehicles are set to receive over-the-air updates to address failures in Autopilot, the carmaker’s much-hyped and oft-abused driver-assistance system. But the recall report published by the National Highway Traffic Safety Administration shows regulators are willing to keep risky technology on the road as long as the driver gets nagged enough.
What’s at issue with the recall is less Autopilot’s ability to brake and accelerate and more its Autosteer functionality, which allows the car to follow curves and make turns. According to NHTSA, “the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse.”
That “misuse” has been well documented in the years since Autopilot’s release. It began with Teslas being “hacked” with a water bottle to let drivers keep their hands completely off the wheel (and sometimes move their bodies into the back seat); later, researchers found that Autopiloted Teslas were involved in 273 crashes over a one-year period. Autopilot has been investigated in almost a dozen cases of vehicles crashing into emergency vehicles, and just this August, thousands of Autopilot complaints from German customers were leaked to Handelsblatt, a German business newspaper.
The initial NHTSA investigation began in 2021, and late this year U.S. regulators met with Tesla twice to address fixes. The automaker eventually decided to resolve the matter by voluntarily administering the recall — while, according to NHTSA, “not concurring with the agency’s analysis.”
A 2 million-car recall isn’t usually construed as a win, but in this case U.S. regulators did not conclude that the technology itself was unsafe, and they determined that drivers are responsible for using Autopilot safely. That is what Tesla has contended since the beginning, and it’s a rebuke to safety advocates, many local legislators, and lawyers representing accident victims and their families.
Both Tesla and NHTSA point out that Autopilot is similar to other Level 2 automated driving systems offered by competing automakers — although those competitors have waded into autonomy more cautiously, building in myriad restrictions and ways to track driver focus. That’s in contrast to Tesla, which, despite ample contravening evidence and multiple lawsuits, still hosts on its website a video of a Model X “self-driving” with no intervention from the passenger.