According to a California Highway Patrol (CHP) traffic crash report, a Tesla driver claimed that their vehicle’s “Full Self-Driving (FSD)” software unexpectedly braked and caused an eight-car pileup in the Yerba Buena Tunnel last month. Nine people were treated for minor injuries, including one child who was hospitalized. The CHP analyzed tunnel footage and discovered that the Tesla performed an illegal lane change before suddenly slowing down from 55 mph to 20 mph, forcing vehicles behind it to collide with one another, according to the December 7th report received by CNN Business.
The crash can be viewed as yet another incident tied to the automaker’s US$15,000 Full Self-Driving software package. But can we still afford not to question the hype behind a technology marketed as the next revolution in the autonomous vehicle industry? Is it possible that, beyond its promise to advance self-driving vehicles, Tesla has overstated the capabilities of its Full Self-Driving software?
This year, the National Highway Traffic Safety Administration (NHTSA) has been investigating multiple cases in which Tesla’s Full Self-Driving or advanced driver-assistance system (ADAS) may have inadvertently contributed to accidents.
All new Tesla vehicles currently come with a standard driver-assistance feature dubbed Autopilot. Additionally, Tesla offers extra functions, namely Smart Summon, Navigate on Autopilot, and Automatic Lane Changes, in a package marketed as Full Self-Driving. Under its FSD Beta program, the company also permits select owners to test features that have not yet been fully debugged on public roads. The software is designed to keep up with traffic, navigate within its lane, and adhere to traffic signals. It requires an attentive human driver who is ready to take complete control of the vehicle at any time. While some drivers have been thrilled by the software, many worry that a Tesla running FSD will misbehave at the worst possible time.
The Full Self-Driving beta, which has been available to everyone in North America since November, has proven worrisome for many Tesla customers who paid US$15,000 for the upgrade on the strength of CEO Elon Musk’s claims. The program occasionally tries to hit curbs or drive on the wrong side of the road. While Tesla is continually improving the technology and addressing its flaws, beta testers’ experiences offer a glimpse into the incredibly risky and expensive gamble the company is placing on its so-called full self-driving technology.
“Full self-driving” was originally conceived as technology that lets vehicles navigate the roads without any human assistance. If Tesla requires human drivers to take over whenever its FSD software malfunctions, the term has already lost its original meaning. To rekindle the hype around fully autonomous vehicles, major companies such as Tesla and Waymo announced that full self-driving would be introduced gradually, starting with testing in geographically restricted areas. Though this year we have seen many companies testing autonomous robo-taxis on the streets of Las Vegas, Phoenix, and Los Angeles, at the current rate it will take decades for them to expand to anything close to countrywide deployment. Costs and the organizational learning curve have been substantially higher and longer than anticipated: despite investing billions of dollars over the course of a decade, the companies have discovered that the technological prerequisites for mass adoption of the technology are far more challenging than they initially expected.
In practice, Tesla’s “Full Self-Driving” software is closer to a Level 2 advanced driver-assistance system that requires constant, active supervision by a driver, the advertising notwithstanding. There is concrete evidence to back this claim. Emails exchanged between the DMV and Tesla in 2019 and 2020, disclosed by PlainSite in response to a public records request, reveal that the company’s Full Self-Driving mode, also known as City Streets, was a Level 2 technology even as Musk was making audacious claims about fully autonomous vehicles. In other words, Tesla’s technology is no more capable of autonomous driving than the rival Level 2 driver-assistance systems offered by other companies.
Given this reality, why does Tesla continue to market itself as the developer of full self-driving software for its vehicles? Is it willful ignorance, steering sales by capitalizing on misinformed hype, or an optimistic bet? And even if Tesla does not formally claim the software enables fully autonomous driving, does the branding set a dangerous precedent?
For now, we have a temporary yet effective answer to this dilemma: California lawmakers have recently passed a new law prohibiting Tesla from labeling its software “Full Self-Driving.” The law, sponsored by Democratic state Sen. Lena Gonzalez of Long Beach and signed by Gov. Gavin Newsom this legislative session, prevents car dealers and manufacturers in California from “deceptively naming or marketing” a car as self-driving if it is outfitted with only partial automation features that still require human drivers to pay attention and handle the driving.
Gonzalez informed the Los Angeles Times that the state Department of Motor Vehicles already has regulations against the misleading advertising of self-driving vehicles. However, the DMV’s lack of enforcement pushed her and state legislators to introduce legislation to incorporate the standards into state law.
The measure, Senate Bill 1398, is one of hundreds of new state laws that take effect in 2023. It explicitly targets Tesla’s promotion of software, included in some Tesla models, that implies the car can fully drive itself. According to Gonzalez, the bill improves consumer safety by requiring dealers and manufacturers that sell new passenger vehicles equipped with a semiautonomous driving-assistance feature to include a comprehensive description of those systems’ capabilities and limitations.
It is important to note that the new law does not address the safety concerns surrounding the Full Self-Driving software itself. It is, however, the most recent instance of politicians, regulators, and customers pushing back against what they describe as false and misleading advertising. Tesla fought the law, claiming that it already makes owners aware of Full Self-Driving’s limits.
As tensions mount, Tesla may have to come clean about the litany of bold claims about “Level 5 autonomous vehicles” that Musk has made year after year. The takeaway is simple: if Tesla cars are sufficiently automated to be advertised as Full Self-Driving, they must be subject to the same testing regulations as the other autonomous vehicles now on our roads. If they are not sufficiently automated to be regulated as autonomous vehicles, Tesla should be barred from marketing the technology as Full Self-Driving. The California government is therefore right to ask companies like Tesla to stop misleading people under the pretense of offering fully autonomous technology.
Though banning Tesla from advertising vehicles as self-driving while they still require driver supervision is a historic milestone, much remains to be done. Even when a car can handle most circumstances safely, drivers will still need to stay alert and be prepared to take over if necessary.
Unlike other self-driving car companies such as Waymo and Cruise, which test their vehicles in carefully monitored pilot projects, Tesla has chosen to put its self-driving technology directly in consumers’ hands. To minimize the risk of accidents or software malfunctions for ordinary drivers, NHTSA should establish a preapproval system before such software is installed, and should require certifications, much as the DMV does for autonomous vehicles operating in California, before these cars hit the roads. Such steps matter because the self-driving industry currently lacks real software and hardware standards.
While companies like Tesla are aiming for fully autonomous driving, that goal does not eliminate the need for driver oversight. Tesla cars under test should have a certified, trained test driver operating the vehicle, and the company should send the DMV regular reports on crashes and on disengagements, instances where the human driver had to take over to prevent a crash. Further, NHTSA should adopt regulations that allow it to take action whenever Tesla launches software updates or recalls software features, whether delivered over the internet or directly through drivers. This would address the governance blind spot that arises when autonomous vehicle companies add new features or patch software flaws remotely, raising concerns about liability, accountability, and safety.