A Tesla driver who died after striking a highway median in Mountain View reportedly complained of problems with his Autopilot and navigation systems in the weeks leading up to the crash in 2018, according to a trove of newly released documents.

Federal investigators at the National Transportation Safety Board (NTSB) released nearly 1,500 pages of information on the 2018 fatal accident, in which 38-year-old Walter Huang’s Model X collided with the barrier between southbound Highway 101 and the Highway 85 carpool flyover. The investigation is looking into whether the highway conditions and the vehicle’s Autopilot lane-keeping assistance played a role in the crash.

While the agency has yet to make a determination, an attorney representing Huang’s family asserted in a letter last year that the vehicle’s Autopilot had been a problem, particularly at the location of the crash. Huang reportedly told his wife, Sevonne, and a friend that his vehicle’s lane-keeping technology was problematic and had a tendency to steer towards the median, also known as the gore point.

“Walter told Sevonne the Autopilot would cause his Tesla to veer towards the barrier involved in his crash, prior to the crash,” according to Mark Fong, an attorney with Minami Tamaki.

Tesla representatives did not immediately respond to requests for comment.

Vehicle maintenance records show that, two weeks prior to the crash, Huang brought his car to a Sunnyvale service center reporting problems with his GPS and navigation system that prevented his cruise control from functioning. A service adviser reportedly was unable to duplicate the problem during the visit, and had “no recollection whether the driver told him about problems encountered while driving vehicle in the vicinity of the gore area on US-101,” according to the documents.

An earlier report released by NTSB found that the Tesla’s Autopilot system, shorthand for a suite of functions including adaptive cruise control and autosteer lane-keeping assistance, was enabled at the time of the crash. Huang’s vehicle was in a lane traveling south on Highway 101 when it moved left and sped up from 62 mph to 70.8 mph. No “precrash braking or evasive steering” movement was detected.

The severe damage to the Tesla breached the battery, causing it to catch fire shortly after the crash. Though bystanders were able to pull Huang from the vehicle just before it was engulfed in flames, he later died of his injuries.

NTSB will be holding a board meeting on Feb. 25 to determine the probable cause of the fatal crash. In a previous report, the agency slammed Caltrans for “systemic problems” that prevented the swift repair of traffic safety equipment that could have lessened the severity of the crash. Caltrans is responsible for maintaining a crash attenuator at the site of the collision, which is equipped with a hydraulic cylinder and cable assembly designed to “telescope” and absorb impact when a vehicle hits it at high speeds.

The attenuator at the highway median had been smashed by a Prius in a solo-vehicle accident 11 days before the Tesla crash. Damaged to the point of being “nonoperational,” it had not yet been replaced.

Alongside the NTSB investigation, Huang’s family has filed a wrongful death suit in Santa Clara County Superior Court. The suit alleges that Autopilot, marketed as a suite of safety features designed to prevent crashes, should have prevented Huang’s Model X from accelerating into a fixed object on the road.

The full public docket released by NTSB can be reviewed online at https://go.usa.gov/xd9u9.

Kevin Forestieri writes for the Mountain View Voice, the sister publication of PaloAltoOnline.com.

Kevin Forestieri is the editor of the Mountain View Voice, which he joined in 2014. He has covered local and regional stories on housing, education and health care, including extensive coverage of Santa...

Join the Conversation

16 Comments

  1. Was this crash caused by defective technology or from a user not understanding the limits of the technology? What does the Tesla owner’s manual say the system is supposed to do in situations like this?

  2. It seems the lane-keeping system functioned flawlessly on that day, centering the vehicle precisely between the lane markers bounding the diverging gore that led it directly into the very beefy barrier. Human drivers easily recognize gores and avoid driving in them, but the hi-tech Tesla eagerly guided itself to destruction in one.

  3. Posted by resident, a resident of Menlo Park

    >> Was this crash caused by defective technology or from a user not understanding the limits of the technology?

    I’m sorry for the family’s loss, but it sounds like some of each. But, an often overlooked aspect of this and several other accidents is the apparent vulnerability of the battery system to fires caused by thermal runaway of the Li-ion batteries after severe mechanical damage. That is what I would like to know more about.

  4. He complained of the malfunction weeks prior to his death, and told his wife and friends it was problematic, yet he still attempted to use that feature on a busy freeway. This terrible tragedy could have been prevented if common sense had prevailed. Autopilot implies a self-driving car; clearly it is not. Condolences to the family.

  5. If one has to rely on autopilot rather than actually driving the car & paying attention to the road, a possible suggestion…use Lyft or Uber to get from point A to point B OR take the bus! *DUH*

  6. I know of someone who mistook the ‘cruise control’ setting on his RV for autopilot.

    After setting it at 55 mph, he went towards the back of the RV to make a sandwich & the vehicle eventually went off the road and into a ditch.

    He got a little bruised up & wasn’t able to finish making his sandwich.

    A stupid old man. I imagine he never heard Roadhouse Blues by The Doors…

    “Keep your eyes on the road, your hands upon the wheel”

  7. I feel bad for his family, but I think the very fact that he knew the autopilot was not working correctly means he was obliged not to depend on it and should have been driving “the old fashioned way.” This is no longer Tesla’s fault.

  8. Sooo……
    The family says he knew the autopilot did not work at the very place the accident happened, and he was using the autopilot at the time of the accident?

    Whose defense are they providing?

  9. This is not new information.
    I remember reading at the time that he had told his wife that the car acted funny when he passed that point on the highway.

  10. I forgot to mention that someone from Tesla should go to jail for releasing this software to the public. It has killed at least two people by running them into stationary objects (the semi was moving perpendicular to the Tesla in the previous crash, so it was effectively stationary).
    You can call a video game “beta,” but calling car software, which will kill some of its drivers, “beta” doesn’t absolve you of liability.
    What happens when one of these crashing Teslas hits a van full of children?
    By the way, regarding the “driver should pay attention every second” warning: first, this is physiologically impossible, while driving a Tesla, piloting a plane on autopilot, watching the controls of a nuclear power plant, or doing anything else. Your brain just isn’t capable of paying attention every second while doing nothing, and it is certainly not capable of instantly taking control in the emergency, even if you happened to be paying attention.
    Second: since the driver knew that the car acted funny at this particular spot, it’s incredibly unlikely that he was *not* paying attention that day.

  11. >> He complained of the malfunction weeks prior to his death, and told his wife and friends it was problematic, yet he still attempted to use that feature on a busy freeway.

    But was he using it on purpose? Might the system have been unintentionally engaged by a faulty switch that would not turn off? What was the precise issue he reported to Tesla that Tesla mechanics could not replicate? A faulty switch might well exhibit this kind of intermittent behavior. Are you reading this, Mr. Musk?

  12. According to the story, he told his wife that autopilot had problems at the exact location where his car crashed. He reported to Tesla that his car’s autopilot was not working properly. He also told a friend the same thing.

    He then turned on autopilot in the exact area he said was a problem, while traveling at over 60 miles an hour.

    He crashed. His family then sued both Tesla and the city.

    Data from the car shows acceleration to just over 70mph just before the crash. Data from the car shows no pre-crash maneuvers taken by the driver or car. Data from the car also shows the driver was alerted multiple times to put his hands on the wheel and the data shows he did not do that.

    My car has automated driver features. I know the areas where it has issues and I simply take over in those areas.

    Looking at all the data, I can’t come to a conclusion other than that the driver wanted his car to crash.

    One question from looking at the raw car data: why does the raw data from the vehicle show a speed of 179 MPH about 30 minutes before the crash?

  13. The only safe autopilot cars these days are WAYMOS.

    They go so SLOW & the drivers/operators are so overcautious that even an ant wouldn’t get run over.

    On the other hand, a WAYMO is ideally designed for lousy and/or easily distracted drivers.

  14. Posted by Sketchy, a resident of Palo Alto High School

    >> One question from looking at the raw car data, why does the raw data from the vehicle show a speed of 179 MPH about 30 minutes before the crash?

    What is the source for that information?
