NTSB report claims Tesla Model X sped up with Autopilot engaged before fatal March crash

Polycount

In context: Tesla has drawn quite a bit of heat lately due to numerous crashes that occurred with Autopilot engaged. One particularly unfortunate incident happened in March, when a Model X crashed into a highway barrier, killing driver Walter Huang. Today, the National Transportation Safety Board released the preliminary results of its investigation into the incident.

Tesla may finally be getting its Model 3 production issues under control, but factory problems are far from the only headaches the company has dealt with lately.

In March, a Tesla Model X with Autopilot engaged collided with a highway barrier, killing driver Walter Huang. Shortly after the tragic incident, the National Transportation Safety Board (NTSB) launched a formal investigation to determine how much fault, if any, lay with Tesla's Autopilot system.

Much to the NTSB's chagrin, Tesla was quick to release details of its own findings. The company claimed Huang's hands were not on the wheel for at least six seconds prior to the crash, despite the vehicle's repeated audible warnings.

While the NTSB's preliminary report, released today, confirms that claim, there appears to be more to the story. According to the report, Autopilot accelerated the Model X from 62 mph to 70.8 mph in the three seconds before the crash, rather than slowing down or coming to a stop.

Indeed, the NTSB claims the vehicle didn't even attempt to brake or evade the barrier it struck.

When TechCrunch asked Tesla to comment on the matter, a spokesperson declined to do so. Instead, the individual pointed the outlet to a blog post the company penned in March.

"Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur," the blog post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."


 
Musk's mistake is launching a technology well ahead of its maturity while other firms are quietly (but diligently) keeping their stuff out of the mainstream.

In 10 years (if not sooner), the error of this is going to be evident when Mercedes-Benz, Audi, BMW, Ford, Toyota, and Honda all come out with some form of "Autopilot" as an option in all of their above-base vehicles. They will brand them as safe, reliable systems that have triple the testing of Musk's offerings with none of the fatalities.

"Autopilot" is what's going to kill Tesla because the brand is going to be recognized as a death trap.

And before any Proud Tesla Owners(TM) whine about autopilot being misused or "it's just one feature!!!" or whatever else... Autopilot has created a brand identity problem for the company, and Musk's antics (recall his recent meltdown) are making that problem worse, not better. Tesla is becoming synonymous with traffic fatalities.

This has nothing to do with the actual numbers, but the fact that the words "Tesla" and "death" are becoming synonyms in the media and casual conversation about the cars.

The only people who don't see this are the fanboys.
 
I am not surprised by this. The final report will be interesting.

@Polycount "how much blame, if any, lay with Tesla's autopilot system"
 
Tesla owners should take a cue from airline pilots: autopilot does not drive the car, it's an assistance tool.

Patrick Smith is an active airline pilot who has been flying commercially since 1990. He told CNBC that the traveling public tends to imagine a pilot reclining back, reading a newspaper, while the autopilot does all the work. The reality is actually quite different, he said.

"The auto flying system does not fly the airplane," he said. "The pilots fly the plane through the automation."
https://www.cnbc.com/2015/03/26/autopilot-what-the-system-can-and-cant-do.html
 
So basically "Hold on to the wheel at all times for dear life because the computer may decide to kill you and/or others"? Got it. Since all hardware/software is not foolproof and will fail, this will always be the case. Please, keep these away for the public's safety.
 
Musk's mistake is launching a technology well ahead of its maturity while other firms are quietly (but diligently) keeping their stuff out of the mainstream.
Agreed.

Toyota, for one, already has some of the components of this in current vehicles, e.g., lane keeping, radar-controlled cruise control, collision avoidance, and perhaps more I am not aware of. My bet is that they are gathering data on how well these components operate in an effort to improve them over time. At some point, that data might be very valuable and perhaps fully integrated into a future "autopilot" system that performs far better than the current Tesla system.

I find it a bit ironic that I joked in another thread about the car warning the driver it was going to kill them if they did not take the wheel, and now we hear that the car sped up as it approached the barrier. Who knows what was going on in the driver's mind, but autopilot speeding up as it approaches a fixed barrier? Crazy, to say the least.
 
"the NTSB claims the vehicle didn't even attempt to break or evade the barrier it struck."
grammar police...should be brake ;)
 