New documents show man killed in fiery Tesla crash reported problems with Autopilot system

Walter Huang died after car crashed into carpool lane barrier in Mountain View

A Tesla driver who died after striking a highway median in Mountain View reportedly complained of problems with his Autopilot and navigation systems in the weeks leading up to the crash in 2018, according to a trove of newly released documents.

Federal investigators at the National Transportation Safety Board (NTSB) released nearly 1,500 pages of information on the 2018 fatal accident, in which 38-year-old Walter Huang's Model X collided with the barrier between southbound Highway 101 and the Highway 85 carpool flyover. The investigation is looking into whether the highway conditions and the vehicle's Autopilot lane-keeping assistance played a role in the crash.

While the agency has yet to make a determination, an attorney representing Huang's family asserted in a letter last year that the vehicle's Autopilot had been a problem, particularly at the location of the crash. Huang reportedly told his wife, Sevonne, and a friend that his vehicle's lane-keeping technology was problematic and had a tendency to steer towards the median, also known as the gore point.

"Walter told Sevonne the Autopilot would cause his Tesla to veer towards the barrier involved in his crash, prior to the crash," according to Mark Fong, an attorney with Minami Tamaki.

Tesla representatives did not immediately respond to requests for comment.

Vehicle maintenance records show that, two weeks prior to the crash, Huang brought his car to a Sunnyvale service center reporting problems with his GPS and navigation system that prevented his cruise control from functioning. A service adviser reportedly was unable to duplicate the problem during the visit, and had "no recollection whether the driver told him about problems encountered while driving vehicle in the vicinity of the gore area on US-101," according to the documents.

An earlier report released by NTSB found that the Tesla's Autopilot system, shorthand for a suite of functions including adaptive cruise control and autosteer lane-keeping assistance, was enabled at the time of the crash. Huang's vehicle was in a lane traveling south on Highway 101 when it moved left and sped up from 62 mph to 70.8 mph. No "precrash braking or evasive steering" movement was detected.

The severe damage to the Tesla breached the battery, causing it to catch fire shortly after the crash. Though bystanders were able to pull Huang from the vehicle just before it was engulfed in flames, he later died of his injuries.

NTSB will be holding a board meeting on Feb. 25 to determine the probable cause of the fatal crash. In a previous report, the agency slammed Caltrans for "systemic problems" that prevented the swift repair of traffic safety equipment that could have lessened the severity of the crash. Caltrans is responsible for maintaining a crash attenuator at the site of the collision, which is equipped with a hydraulic cylinder and cable assembly designed to "telescope" and absorb impact when a vehicle hits it at high speeds.

The attenuator at the highway median had been smashed by a Prius in a solo-vehicle accident 11 days before the Tesla crash, leaving it damaged to the point of being "nonoperational," and it had not yet been replaced.

Alongside the NTSB investigation, Huang's family has also filed a wrongful death suit in Santa Clara County Superior Court. The suit alleges that Autopilot, marketed as a suite of safety features designed to prevent crashes, should have prevented Huang's Model X from accelerating into a fixed object on the road.

The full public docket released by NTSB can be reviewed online at https://go.usa.gov/xd9u9.

---


Kevin Forestieri writes for the Mountain View Voice, the sister publication of PaloAltoOnline.com.


Comments

6 people like this
Posted by resident
a resident of Menlo Park
on Feb 12, 2020 at 12:38 pm

Was this crash caused by defective technology or from a user not understanding the limits of the technology? What does the Tesla owner's manual say the system is supposed to do in situations like this?


8 people like this
Posted by An Engineer
a resident of Duveneck/St. Francis
on Feb 12, 2020 at 8:29 pm

It seems the lane keeping system functioned flawlessly on that day, centering the vehicle precisely between the lane markers bounding the diverging gore that was leading it directly to the very beefy barrier. Human drivers easily recognize gores and avoid driving in them, but the hi-tech Tesla eagerly guided itself to destruction in one.


4 people like this
Posted by Resident
a resident of Another Palo Alto neighborhood
on Feb 12, 2020 at 10:05 pm

Or was he playing a video game on his phone? Web Link


2 people like this
Posted by Anon
a resident of Another Palo Alto neighborhood
on Feb 13, 2020 at 10:02 am

Posted by resident, a resident of Menlo Park

>> Was this crash caused by defective technology or from a user not understanding the limits of the technology?

I'm sorry for the family's loss, but it sounds like some of each. But, an often overlooked aspect of this and several other accidents is the apparent vulnerability of the battery system to fires caused by thermal runaway of the Li-ion batteries after severe mechanical damage. That is what I would like to know more about.


9 people like this
Posted by Tesla at fault, but why did he use a faulty feature?
a resident of another community
on Feb 13, 2020 at 10:41 am

He complained of the malfunction weeks prior to his death, and told his wife and friends it was problematic, yet he still attempted to use that feature on a busy freeway. This terrible tragedy could have been prevented if common sense had prevailed. Autopilot implies a self-driving car; clearly it is not. Condolences to the family.


7 people like this
Posted by Watch The Road!
a resident of Adobe-Meadow
on Feb 13, 2020 at 3:00 pm

If one has to rely on autopilot rather than actually driving the car & paying attention to the road, a possible suggestion...use Lyft or Uber to get from point A to point B OR take the bus! *DUH*


6 people like this
Posted by Roadhouse Blues
a resident of Barron Park
on Feb 13, 2020 at 4:22 pm

I know of someone who mistook the 'cruise control' setting on his RV for autopilot.

After setting it at 55 mph, he went towards the back of the RV to make a sandwich & the vehicle eventually went off the road and into a ditch.

He got a little bruised up & wasn't able to finish making his sandwich.

A stupid old man. I imagine he never heard Roadhouse Blues by The Doors...

"Keep your eyes on the road, your hands upon the wheel"


1 person likes this
Posted by Dt north
a resident of Downtown North
on Feb 13, 2020 at 5:04 pm

I feel bad for his family, but the very fact that he knew the autopilot was not working correctly means he was obliged not to depend on it and should have been driving “the old fashioned way.” This is no longer Tesla’s fault.


6 people like this
Posted by Old Joe
a resident of Barron Park
on Feb 13, 2020 at 5:18 pm

Sooo......
The family says he knew the autopilot did not work at the very place the accident happened, and he was using the autopilot at the time of the accident?

Whose defense are they providing?


2 people like this
Posted by Paul
a resident of another community
on Feb 13, 2020 at 8:23 pm

This is not new information.
I remember reading at the time that he had told his wife that the car acted funny when he passed that point on the highway.


2 people like this
Posted by Paul
a resident of another community
on Feb 13, 2020 at 8:33 pm

I forgot to mention that someone from Tesla should go to jail for releasing this software to the public. It has killed at least two people by running them into stationary objects (the semi was moving perpendicular to the Tesla in the previous crash, so it was effectively stationary).
You can call a video game "beta," but calling car software that will kill some of its drivers "beta" doesn't absolve you of liability.
What happens when one of these crashing Teslas hits a van full of children?
By the way, regarding the "driver should pay attention every second" warning: first, this is physiologically impossible, whether you are driving a Tesla, piloting a plane on autopilot, watching the controls of a nuclear power plant, or doing anything else. Your brain just isn't capable of paying attention every second while doing nothing, and it is certainly not capable of instantly taking control in an emergency, even if you happened to be paying attention.
Second: since the driver knew that the car acted funny at this particular spot, it's incredibly unlikely that he was *not* paying attention that day.


2 people like this
Posted by An Engineer
a resident of Duveneck/St. Francis
on Feb 13, 2020 at 8:46 pm

>> He complained of the malfunction weeks prior to his death, and told his wife and friends it was problematic, yet he still attempted to use that feature on a busy freeway.

But was he using it on purpose? Might the system have been unintentionally engaged by a faulty switch that would not turn off? What was the precise issue he reported to Tesla that Tesla mechanics could not replicate? A faulty switch might well exhibit this kind of intermittent behavior. Are you reading this, Mr. Musk?


2 people like this
Posted by Sketchy
a resident of Palo Alto High School
on Feb 14, 2020 at 6:13 am

According to the story, he told his wife that autopilot had problems at the exact location where his car crashed. He reported to Tesla that his car's autopilot was not working properly. He also told a friend the same thing.

He then turned on autopilot in the exact area he said was a problem, while traveling at over 60 miles an hour.

He crashed. His family then sued both Tesla and the city.

Data from the car shows acceleration to just over 70mph just before the crash. Data from the car shows no pre-crash maneuvers taken by the driver or car. Data from the car also shows the driver was alerted multiple times to put his hands on the wheel and the data shows he did not do that.

My car has automated driver features. I know the areas where it has issues and I simply take over in those areas.

Looking at all the data, I can't come to a conclusion other than that the driver wanted his car to crash.

One question from looking at the raw car data, why does the raw data from the vehicle show a speed of 179 MPH about 30 minutes before the crash?


6 people like this
Posted by Moral of the Story
a resident of Community Center
on Feb 14, 2020 at 8:23 am

The only safe autopilot cars these days are WAYMOS.

They go so SLOW & the drivers/operators are so overcautious that even an ant wouldn't get run over.

On the other hand, a WAYMO is ideally designed for lousy and/or easily distracted drivers.


Like this comment
Posted by Anon
a resident of Another Palo Alto neighborhood
on Feb 14, 2020 at 9:14 am

Posted by Sketchy, a resident of Palo Alto High School

>> One question from looking at the raw car data, why does the raw data from the vehicle show a speed of 179 MPH about 30 minutes before the crash?

What is the source for that information?


Like this comment
Posted by Data
a resident of Palo Alto Hills
on Feb 15, 2020 at 8:11 am

The source looks like it is here:

Web Link

Then click on #6 "Tesla Driver Assistance System Data"

Then click "View" (it is an Excel document showing data the car reported)

Then look at the "Vehicle Speed (mph)" column

