Tesla Inc. has found itself on a collision course with state and federal agencies — plus a customer who initiated a class-action lawsuit — over allegations of misleading the public regarding its cars' self-driving capabilities.
The legal actions could result in hefty fines, reimbursements and suspension of the company's California manufacturer license, which could affect its Fremont plant and retail stores.
On Wednesday, Sept. 14, attorneys at the Burlingame-based firm Cotchett, Pitre & McCarthy, LLP filed the class-action lawsuit in the U.S. District Court in San Francisco on behalf of consumer Briggs Matsko of Rancho Murieta, a community about 25 miles east of Sacramento. Matsko bought a Tesla Model X SUV and paid about $15,000 extra for Tesla's "Enhanced Autopilot" option.
In separate actions, the lawsuit and the California Department of Motor Vehicles accuse Tesla of putting systems on the market that don't perform as advertised, and assert the company is aware of the systems' limitations and even the dangers they could pose. They also allege the automaker kept up a false narrative that full autonomy was just around the corner, achievable through software updates.
Tesla's practice in essence made consumers the test subjects of its software, the 84-page class-action lawsuit claims.
The allegations concern Tesla's Autopilot, Enhanced Autopilot and Full Self-Driving Capability ("FSD") technology, which the class-action complaint alleges has been deceptively marketed since at least 2016 as either already fully functional or close to being so, according to attorneys for the plaintiff.
"Tesla has deceived and misled consumers regarding the current abilities of its … technology and by representing that it was perpetually on the cusp of perfecting that technology and finally fulfilling its promise of producing a fully self-driving car," the suit claims. "Although these promises have proven false time and time again, Tesla and Musk have continued making them to generate media attention, to deceive consumers into believing it has unrivaled cutting-edge technology, and to establish itself as a leading player in the fast-growing electric vehicle market," the lawsuit alleges.
The attorneys pointed to a video Tesla published on its website in October 2016 that purported to show one of its cars driving autonomously.
The video opens with the message, "The person in the driver's seat is only there for legal reasons. He is not driving anything. The car is driving itself."
But Tesla employees who created the video later revealed that the car had significant assistance from commercial mapping software that was not available to Tesla customers. Even with that help, the car performed poorly and ran into a fence during filming, the complaint states.
"With the assistance of a large team of Tesla engineers, the car had to run the same route over and over again before Tesla got acceptable video that appeared to show a car capable of driving itself. Even though the video was debunked as deceptive and misleading years ago, Tesla continues to prominently feature it on its website."
The lawsuit alleges Tesla hasn't produced a fully self-driving car; instead, owners receive "updates" of Tesla's Autopilot software and Full Self-Driving Capability beta software.
Owners have reported problems including cars having difficulty making routine turns, running red lights and steering into oncoming traffic. Collisions involving Tesla's software have included vehicles crashing at high speeds into large stationary objects such as emergency vehicles and an overturned box truck, according to the complaint.
The software has also been tied to deaths, including that of Apple engineer Walter Huang, who was killed in March 2018 when the Autopilot technology on his Tesla Model X became confused at a fork in the highway and caused the car to veer sharply and crash into a concrete barrier in Mountain View, the complaint noted.
"People have suffered fatal and other serious injuries as a result of the Tesla's autopilot and self-driving technology, triggering investigations by the National Highway Traffic Safety Administration, the National Transportation Safety Board, and other regulators," the law firm said in a statement.
The lawsuit says that through its advertising from 2017 to 2019, Tesla continued to push the narrative of a safe vehicle capable of autonomous driving:
"All you will need to do is get in and tell your car where to go. If you don't say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely pack (sic) freeways with cars moving at high speed."
Yet its upgraded software continued to be implicated in crashes and consumer complaints. In October 2021, an update to the full self-driving beta software caused a major increase in "phantom braking" incidents, where the software identifies a nonexistent threat that triggers the vehicle's emergency braking system. Tesla vehicles, traveling at various speeds, suddenly slammed on the brakes for no apparent reason, according to the lawsuit.
Tesla initially claimed it had identified the source of the problem and fixed it with a software update released on Oct. 25, 2021, but the company issued a formal recall for more than 11,000 cars using the software, according to the complaint.
"Tesla's claims of having fixed the problem, however, turned out to be false, as driver complaints about 'phantom braking' issues soared to 107 NHTSA (National Highway Traffic Safety Administration) complaints in the three-month period of November 2021 through January 2022 (compared with only 34 such complaints in the preceding 22 months)," the complaint said.
Tesla is also facing scrutiny by federal and state regulators.
Federal investigations
In June 2021, the National Highway Traffic Safety Administration (NHTSA) issued an order requiring automobile manufacturers to report any crash involving an injury, fatality or property damage that occurs while, or immediately after, a vehicle is performing automated driving tasks.
The NHTSA opened a preliminary safety-defect investigation into Tesla's Autopilot technology in August 2021. The investigation covers an estimated 830,000 Tesla vehicles from the 2014 through 2022 model years, including the Model Y, X, S and 3.
Two U.S. senators called for the Federal Trade Commission to investigate Tesla's potentially deceptive marketing practices surrounding its full self-driving technology.
DMV seeks to suspend Tesla's license
On July 28 of this year, the California Department of Motor Vehicles, which licenses motor vehicle manufacturers and dealerships in the state, including Tesla's Fremont factory and Tesla retail stores, brought two administrative enforcement actions against Tesla. The complaints allege the company engaged in deceptive advertising regarding its advanced driver assistance systems on at least five dates from May 28, 2021 through July 12, 2022.
The Tesla cars "equipped with those ADAS (advanced driver assistance systems) features could not at the time of those advertisements, and cannot now, operate as autonomous vehicles," the DMV complaint alleges.
The complaint asks the DMV's Office of Administrative Hearings to suspend or revoke Tesla's manufacturer license and to require the company to pay restitution to people or institutions that have suffered financial loss or damage as a result of the alleged deceptions.
Tesla has not returned requests for comment regarding these allegations, but the company's attorneys stated to the DMV on multiple occasions that its technology wasn't fully autonomous, according to the class-action lawsuit and the DMV complaint.
On Dec. 28, 2020, in a letter to the California DMV, Tesla's legal counsel acknowledged the limitations of Tesla's full self-driving technology, describing it as "an additional optional suite of features that builds from Autopilot." Those features include Navigate on Autopilot, Auto Lane Change, Autopark, Summon, Smart Summon, Traffic and Stop Sign Control, and the upcoming Autosteer on City Streets (City Streets).
"While we designed these features to become more capable over time through over-the-air software updates, currently neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous. This includes the limited pilot release of City Streets," the company said.
Tesla has published disclaimers, including one from June 28, 2022, that stated in part: "The currently enabled features require active driver supervision and do not make the vehicle autonomous." But the DMV complaint argues the disclaimers are not enough.
"However, the disclaimer contradicts the original untrue or misleading labels and claims, which is misleading, and does not cure the violation," the DMV said.
Tesla moved its headquarters from Palo Alto to Austin in December 2021.
Comments
Registered user
Barron Park
on Sep 19, 2022 at 10:51 am
I got duped into paying for the FSD option when I leased the model 3. Tesla declined to refund me the option after it was clear no meaningful functionality is going to be delivered for the duration of my lease.
Registered user
Downtown North
on Sep 19, 2022 at 11:02 am
Hmm. I wonder if this issue will stir up Ralph Nader and his 1965 book Unsafe At Any Speed. How many people remember that very complex Nader chapter in American history books? Never underestimate the power of trial attorneys.
Registered user
Mountain View
on Sep 19, 2022 at 12:38 pm
Finally Tesla is being held accountable for its, IMHO, slipshod, cheap, poorly designed, and lethal "self-driving" software and hardware. IMHO, Musk is a megalomaniac conman who should start his own religion, just like that sci-fi writer did with Scientology --- and got very wealthy.
Registered user
Old Palo Alto
on Sep 19, 2022 at 1:04 pm
Thank you, Sue Dremann, and the Palo Alto Weekly, for this fantastic piece of reporting.
Those videos on the Tesla website have been claiming for years that autonomous driving was "right around the corner!" and Tesla has taken countless $15,000 payments for the tech, even though Tesla had to know that the functionality was nowhere close to prime time. And if Tesla did not know that in 2016, it sure knew it in 2018, when numerous accidents, several of them deadly, had been reported. Come 2022, is Tesla still taking $15,000 for a product that does not work, and never has? If so, Tesla is liable.
(I also don't like Tesla's tracking technology, as I'd rather Elon Musk not know exactly where my family is. Plus I think there are ample great alternatives on the market -- we bought an all-electric Mini Cooper SE to replace my 2012 Leaf last year, as well as a Ford Mustang Mach-E GT, both of which are all electric and high-performing ... although it's better not to drive!)
Tesla has been a victory of marketing, and I don't blame the customers for buying into it. Unfortunately, Tesla's marketing, while arguably phenomenal, has overreached. This sounds like a winning lawsuit, and I think that the plaintiffs, the DMV, and the other government regulators are correct that when it comes to Tesla's self-driving functionality, Tesla crossed the line from aggressive advertising, to intentional misrepresentation.
I wouldn't mind Tesla being shut down in California, given the large number of legally compliant options, but to me, at the very least Tesla should not be allowed to sell the $15,000 upgrades -- and it should be forced to reimburse all customers who paid.
NB: although Tesla moved its "official" headquarters to Texas, it has been *expanding* in Palo Alto. According to reliable news sources, Tesla is moving into the former HP site off Page Mill.
Registered user
Palo Verde
on Sep 19, 2022 at 1:08 pm
Has worked well for me. I use it as directed: let the car drive AND be prepared for problems at all times.
Registered user
Mountain View
on Sep 19, 2022 at 7:26 pm
The accident in 2018 in Mountain View is not a good one to reference. Huang, if memory serves, hit a steel safety barrier that had collapsed in a previous accident and had not been repaired. It ripped the speeding car in half, as could be seen. It was clear that Huang was not following the instructions to supervise the car, or that he tried to change the car's direction too late. Too close to the barrier, a car would lose grip, as there was small debris around it. I drove and continue to drive through there as a Mountain View resident. That "Y" is where, if you go left, you get on 85 to Cupertino; if right, you continue on 101. It's no place to speed. That was a terrible accident, but too much of the blame, IMHO, must fall on the driver, who was a software engineer at Apple and so not a naïve user. There were doubtless suits filed after the accident with more information. Perhaps they disagree.
Tesla says that its system saves lives on net. That is, its accident rate per million miles is lower than that of human drivers. They are also trying to do non-geofenced full self-driving - the most difficult problem, apparently not yet sufficiently solved. It is true that a zero-accident system will likely never be devised, though the issue here is advertising. But tort lawyers will demand zero, of course, though the states would not, given the endless body count. IMO, Tesla was in error in getting rid of front-facing radar in favor of cameras only. Hitting big objects and phantom braking would be reduced.
Recently, after dark with a little rain, I was stopped at a light and an almost invisible pedestrian crossed, in dark, non-reflective clothing head to foot. I doubt that a camera would see him unless it's good into the infrared.
Registered user
another community
on Sep 20, 2022 at 6:14 pm
Tesla does need to be held accountable for misleading consumers, but consumers need to wise up. If you trust self-driving tech, you're naive. If technology could be trusted, your computer wouldn't have the issues it has, your TV wouldn't lose connection during a good movie or ballgame, and employers wouldn't have to hire an IT department.
The average motorist with a clean driving record is hard enough to trust on the road, let alone "self-driving." What were consumers thinking?
Registered user
Embarcadero Oaks/Leland
on Sep 20, 2022 at 6:59 pm
Thanks for the article. Remember that Musk/Tesla has as little regard for the health and safety of its workers as for its customers. Remember that during the height of the pandemic it refused to let its workers distance themselves, and lots of them got sick? And then there's the hostile workplace environment, with all the race and sex discrimination suits.
Registered user
Midtown
on Sep 23, 2022 at 1:28 pm
What I'm wondering is when the safety authorities will investigate the driver-distraction issue of that touchscreen-only dashboard on the Model 3 etc. My Leaf, like nearly every car (electric or not) on the road, has all the important controls (headlights, wipers, defroster, heat, AC) operable by touch or at worst by a brief glance. I don't want to be poking a touchscreen at 70mph and I don't really want to share the freeway with someone who is.
Tesla has the best batteries and by far the best charging network, but for me that touchscreen is a deal-breaker. Do we have any reliable statistics on driver-distraction crashes in Teslas?