Tesla Alters its Definition of Full Self Driving in Midst of Lawsuits

  • Musk Exposed
  • Sep 10
  • 3 min read

For years, Elon Musk has promised a future where Teslas could navigate roads on their own, allowing drivers to sit back and relax. At the center of this goal is "Full Self-Driving" (FSD) technology. However, Tesla is now making a significant change to its official description of FSD, a move that appears to be connected to a growing number of legal challenges.


The Quiet Change on Tesla's Website

Tesla crashes raise questions

Tesla has quietly updated the description of its FSD package on its website. The site now includes a disclaimer stating that the system "requires active driver supervision and does not make the vehicle autonomous." This small but important change clarifies that the technology, despite its name, is still a driver-assistance feature, not a true self-driving system. 


This update follows a decade of promises from Musk that a fully autonomous system was always just "a year" away. As Musk stated in 2023 after incorrectly predicting FSD capabilities for over 5 years: “I'm the boy who cried FSD, but I think we'll be better than human[s] by the end of this year. I've been wrong in the past, I may be wrong this time.”


It is clear from Tesla’s FSD history that Musk is still wrong. 


When the Technology Fails


A critical part of the public’s distrust of Musk’s FSD promises is the steady stream of accidents and dangerous situations that FSD continues to place people in. For example, a Tesla running FSD was involved in an accident that killed a 22-year-old woman after it blew through an intersection at 60 miles per hour and struck a parked car she was standing next to.


In another instance, a Tesla using FSD struck a 71-year-old grandmother after failing to recognize that traffic had stopped due to a crash and not detecting her because it was “blinded” by the glare of the sunlight. Adding insult to injury, Tesla took seven months to report the crash details to the National Highway Traffic Safety Administration (NHTSA).


These incidents fuel the debate about Tesla's choice to rely solely on cameras, a decision that has been championed by Musk. In a public comment, he questioned the use of other sensors, asking, "If lidars/radars disagree with cameras, which one wins?" He has also claimed that "sensor ambiguity causes increased, not decreased, risk" and that Tesla turned off radars "to increase safety."


However, leaked private messages from Musk present a different view. In a conversation about the topic, Musk reportedly wrote, "A very high resolution radar would be better than pure vision, but such a radar does not exist." This comment contradicts his public position and shows an awareness of the limitations of the camera-only system.


Facing the Courts


This change in language isn't happening in a vacuum. Tesla is currently dealing with several lawsuits from customers and government regulators who claim the company has made false and misleading statements about its FSD technology.


A California judge recently allowed a class-action lawsuit to move forward, with plaintiffs alleging that Musk and Tesla misrepresented the capabilities of the vehicles' self-driving systems for eight years. The lawsuit centers on claims that FSD lacked the necessary hardware for advanced autonomy and never successfully demonstrated a long-distance, self-driving trip. The judge's decision noted that Tesla's marketing, including statements from Musk himself, was seen by enough customers to justify a group lawsuit.


In a separate legal action, the California Department of Motor Vehicles (DMV) is seeking to suspend Tesla's sales and manufacturing licenses in the state. The DMV argues that the use of terms like "Autopilot" and "Full Self-Driving" is deceptive marketing because it gives drivers the false impression that the cars can operate without a human in control.

