Wednesday, May 29, 2024

Tesla’s Autopilot is facing unprecedented scrutiny. But why?

Tesla's Autopilot is facing a series of lawsuits and investigations over fatal car accidents involving Tesla electric vehicles.

Elon Musk has championed Tesla's driver-assistance Autopilot and Full Self-Driving (FSD) software as innovative advancements that can improve road safety while positioning the electric vehicle maker as a technology leader. Even so, the company has been battling its biggest challenge since Autopilot's launch in 2015, as the feature comes under severe scrutiny.

Tesla’s Autopilot is facing a series of lawsuits and investigations over fatal accidents involving its cars. So what might be driving such unprecedented scrutiny of the system? Let’s look at some of the legal and regulatory challenges Tesla has been dealing with:


A Model S driver charged with manslaughter after a fatal 2019 accident in Los Angeles while using Tesla Autopilot is facing trial, set to begin on November 15. Although Tesla was not charged, its system and the company's claims about it are expected to be in focus, and several US senators have demanded an investigation. Legal experts are watching the trial closely as a test case for assigning fault to a human driver in a car that is partly driving itself. The family of the people who died in the crash has also sued the EV maker, claiming Tesla should have taken action to safeguard against abuses of Autopilot.


Another lawsuit against Tesla will go to trial in February. It concerns an accident in which 50-year-old Tesla Model 3 owner Jeremy Banner was killed when his EV struck a tractor-trailer at a highway intersection in Florida. It will be the first Autopilot-related civil lawsuit to go to trial. In addition, a 2018 crash of a Tesla Model X killed its driver, Apple engineer Walter Huang, when the car slammed into a concrete divider on a freeway in Mountain View, California. A lawsuit filed against Tesla by his wife is set to go to trial in March.

The National Transportation Safety Board investigated the Florida and California accidents and assigned blame to both Tesla and the drivers. The NTSB said the drivers relied too heavily on the Autopilot system, while Tesla failed to restrict Autopilot use or adequately monitor driver attentiveness.


Tesla is also facing investigations by the US Department of Justice, California's DMV, and the NHTSA over claims that the company's EVs can drive themselves. The Department of Justice investigation could potentially conclude with criminal charges against Tesla or its executives.

In August, California’s transportation regulator accused Tesla of “deceptive practices” in advertising that suggested its driver-assistance technology enabled autonomous vehicle control. The Department of Motor Vehicles (DMV) could potentially suspend Tesla’s license to sell EVs in California and ask the company to make restitution to drivers. The DMV is also conducting an independent safety review that could force Tesla to apply for regulatory permits to operate its electric vehicles in the state.

In June, the National Highway Traffic Safety Administration upgraded its defect investigation covering 830,000 Tesla vehicles equipped with Autopilot, a required step before it can seek a recall. The auto safety regulator is reviewing whether Tesla vehicles adequately ensure drivers are paying attention. Since 2016, the NHTSA has opened nearly 40 special investigations into crashes involving Tesla vehicles, accounting for 19 deaths.


The lawsuits and investigations above suggest that Tesla’s Autopilot has been a contributing factor in some of these accidents, though Tesla’s exact legal liability remains unclear. Despite their names, Autopilot and Full Self-Driving have significant limitations.

In a letter to California’s Department of Motor Vehicles, a Tesla lawyer acknowledged that Full Self-Driving struggles to react to a wide range of driving situations and should not be considered a fully autonomous driving system. Germany’s federal motor transport authority, the KBA, also found abnormalities while investigating Tesla’s Autopilot function. The software and sensors cannot control the car in many situations, which is why drivers need to keep their eyes on the road and their hands close to the wheel. For now, it seems only fair that Tesla cooperates with the ongoing lawsuits and investigations and, if found liable, takes responsibility for its shortcomings.


Sahil Pawar
I am a graduate with a bachelor's degree in statistics, mathematics, and physics. I have been working as a content writer for almost 3 years and have written for a wide range of domains. I also have a keen interest in fashion and music.
