Tesla Supercharger spots are seen in a parking lot in Austin, Texas, on Sept. 16, 2024.
Brandon Bell | Getty Images
Tesla is being sued by the family of a driver who died in a 2023 collision, claiming that the company's "fraudulent misrepresentation" of its Autopilot technology was to blame.
The Tesla driver, Genesis Giovanni Mendoza-Martinez, died in the crash involving a Model S sedan in Walnut Creek, California. His brother, Caleb, who had been a passenger at the time, was seriously injured.
The Mendoza family sued Tesla in October in Contra Costa County, but in recent days Tesla had the case moved from state court to federal court in California's Northern District. The Independent first reported on the venue change. Plaintiffs typically face a higher burden of proof in federal court for fraud claims.
The incident involved a 2021 Model S, which crashed into a parked fire truck while the driver was using Tesla's Autopilot, a partially automated driving system.
Mendoza's attorneys argued that Tesla and Musk have exaggerated or made false claims about the Autopilot system for years in order to "generate excitement about the company's vehicles and thereby improve its financial condition." They pointed to tweets, company blog posts, and remarks on earnings calls and in press interviews.
In their response, Tesla attorneys said the driver's "own negligent acts and/or omissions" were to blame for the collision, and that "reliance on any representation made by Tesla, if any, was not a substantial factor" in causing harm to the driver or passenger. They claim Tesla's cars and systems have a "reasonably safe design," in compliance with state and federal laws.
Tesla didn't respond to requests for comment about the case. Brett Schreiber, an attorney representing the Mendoza family, declined to make his clients available for an interview.
There are at least 15 other active cases focused on similar claims involving Tesla incidents where Autopilot or FSD, the company's Full Self-Driving (Supervised) system, had been in use just before a fatal or injurious crash. Three of those have been moved to federal courts. FSD is the premium version of Tesla's partially automated driving system. While Autopilot comes as a standard option in all new Tesla vehicles, owners pay an up-front premium, or subscribe monthly, to use FSD.
The crash at the center of the Mendoza-Martinez lawsuit has also been part of a broader Tesla Autopilot investigation by the National Highway Traffic Safety Administration, initiated in August 2021. During the course of that investigation, Tesla made changes to its systems, including a number of over-the-air software updates.
The agency has opened a second probe, which is ongoing, evaluating whether Tesla's "recall remedy" to resolve issues with the behavior of Autopilot around stationary first responder vehicles had been effective.
NHTSA has warned Tesla that its social media posts may mislead drivers into believing its cars are robotaxis. Additionally, the California Department of Motor Vehicles has sued Tesla, alleging its Autopilot and FSD claims amounted to false advertising.
Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk told his 206.5 million-plus followers on X to "Demonstrate Tesla self-driving to a friend tomorrow," adding that, "It feels like magic."
Musk has been promising investors that Tesla's cars would soon be able to drive autonomously, without a human at the wheel, since around 2014. While the company has shown off a design concept for an autonomous two-seater called the CyberCab, Tesla has yet to produce a robotaxi.
Meanwhile, competitors including WeRide and Pony.ai in China, and Alphabet's Waymo in the U.S., are already operating commercial robotaxi fleets and services.