NTSB determines that Autopilot was not engaged at time of fatal Tesla crash

midian182

In brief: An investigation into a fatal Tesla car crash in 2021 that many suspected was the fault of the car's Autopilot system has concluded that the feature could not have been activated at the time. National Transportation Safety Board (NTSB) investigators wrote that the driver was under the influence of drugs and alcohol and had been operating the car up until the moment of the collision.

On April 17, 2021, a 2019 Tesla Model S P100D missed a turn, hit a tree, and caught fire in Spring, Texas. The two occupants, who were killed in the crash, were found unbuckled in the passenger seat and rear seat, leading to suspicions that the Autopilot had been engaged.

The NTSB investigation found security video footage of Dr. William Varner entering the driver's seat of the car and Everette Talbot entering the passenger seat before the Tesla drove away, but first responders to the accident found Varner's body in the backseat.

The report found that Autopilot couldn't have been engaged because the road where the accident occurred had no lane lines, which Autosteer (the steering component of Autopilot) requires. Moreover, Traffic-Aware Cruise Control would not have allowed the vehicle to reach the speeds recorded at the time of the crash. Analysis also showed that both seatbelts were fastened at the time of the collision and the steering wheel was buckled and broken.

The event data recorder in the Model S revealed that, in the five seconds before the crash, the car accelerated from 39 mph to 67 mph in the space of two seconds and was traveling at 57 mph at the moment of impact. The airbags deployed, and the fire started due to a damaged front battery module.
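For context, a quick back-of-the-envelope check of those figures (a sketch using only the numbers reported above; the unit conversions are standard):

```python
# Back-of-the-envelope check of the EDR figures quoted above. Speeds are
# from the article; the conversion factors are standard constants.

MPH_TO_MS = 0.44704  # 1 mph in metres per second
G = 9.81             # standard gravity, m/s^2

v0_mph, v1_mph = 39.0, 67.0   # speed before and after the two-second burst
dt = 2.0                      # duration of the burst, seconds

accel_ms2 = (v1_mph - v0_mph) * MPH_TO_MS / dt
print(f"acceleration: {accel_ms2:.2f} m/s^2 ({accel_ms2 / G:.2f} g)")
# -> roughly 6.26 m/s^2, or about 0.64 g: hard, pedal-to-the-floor
#    acceleration rather than the kind of gradual ramp you'd expect
#    from a cruise system.
```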

The NTSB blamed the accident on excessive speed and driver impairment from alcohol and two sedating antihistamines.

The NTSB has opened several investigations into crashes involving Tesla vehicles over the years and has warned that the name of the company's "Full Self-Driving Capability" package is misleading and irresponsible.

Yesterday saw Apple co-founder Steve Wozniak accuse Elon Musk and Tesla of "robbing" him and his family over false claims about Tesla's self-driving capabilities.


 
With a burnt-out ICE car, you can usually still identify the make and model. But it looks like that battery burns at a higher flame temperature.
 
"The two occupants, who were killed in the crash, were found unbuckled in the passenger seat and rear seat, leading to suspicions that the Autopilot had been engaged." Also. "Analysis also showed that both seatbelts were fastened at the time of the collision and the steering wheel was buckled and broken." Looks like Musk has also discovered matter-transfer and self-buckling seat belts.
 
On the Tesla website it says:

"Active safety features come standard on all Tesla vehicles made after September 2014 for elevated protection at all times. "

"Automatic Emergency Braking: Detects cars or obstacles that the car may impact and applies the brakes accordingly"
"Lane Departure Avoidance: Applies corrective steering to keep your car in the intended lane"
"Emergency Lane Departure Avoidance: Steers your car back into the driving lane when it detects that your car is departing its lane and there could be a collision"

It seems none of these features protected the occupants of the car. Autopilot or not, that's still an issue. Bursting into flames is also an issue. Situations like this shouldn't happen, given how much Tesla has bragged about how safe their cars are.

 
It seems none of these features protected the occupants of the car. Autopilot or not, that's still an issue. Bursting into flames is also an issue. Situations like this shouldn't happen, given how much Tesla has bragged about how safe their cars are.
I'm not a Tesla fan -- at all -- but the trick on those emergency braking systems is they must not have false positives. Curvy road, you can't have it decide you're going to hit the guard rail and slam on the brakes. It can't pick up leaves hanging over the road and decide to slam on the brakes. I don't expect these systems to stop if you're drunkenly swerving around and aim it at a tree -- the system could safely assume you were intending to drive around the tree. (My friend actually had just that problem with his Tesla -- false positive -- after a software update one day, apparently it decided an overhead bridge was blocking the road and slammed on the brakes in the middle of the highway!)
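To make the false-positive tension concrete, here's a rough sketch of the kind of gating an automatic-braking system has to do. Every name and threshold below is made up for illustration; this is not Tesla's actual logic:

```python
# Hypothetical sketch of AEB trigger gating -- illustrative only, not
# Tesla's implementation. The core tension: brake too eagerly and you
# get phantom braking (the overhead-bridge story); brake too cautiously
# and you miss real obstacles.

from dataclasses import dataclass

@dataclass
class Detection:
    time_to_collision_s: float  # estimated seconds until impact
    confidence: float           # classifier confidence, 0..1
    in_path: bool               # is the object actually in the driving path?

def should_brake(history: list[Detection],
                 ttc_threshold: float = 1.5,
                 min_confidence: float = 0.9,
                 persistence_frames: int = 5) -> bool:
    """Brake only if a high-confidence, in-path threat persists across
    several consecutive frames -- one noisy frame (leaves, a bridge
    shadow) should never slam the brakes."""
    recent = history[-persistence_frames:]
    if len(recent) < persistence_frames:
        return False
    return all(d.in_path and
               d.confidence >= min_confidence and
               d.time_to_collision_s <= ttc_threshold
               for d in recent)
```

Requiring the threat to persist across several frames is what keeps leaves and bridge shadows from triggering the brakes, and it's also why such a system can't be counted on to override a driver who points the car at an obstacle at the last moment.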

I'm glad the article makes it clear the AutoPilot would not have reached the speeds they were going, etc. One trick Tesla used to use in reports to NHTSA was to report "The AutoPilot was not engaged at the time of the accident" -- not mentioning that the AutoPilot was not engaged *at the time* because it had shut itself down a few seconds earlier.
 
I'm glad the article makes it clear the AutoPilot would not have reached the speeds they were going, etc. One trick Tesla used to use in reports to NHTSA was to report "The AutoPilot was not engaged at the time of the accident" -- not mentioning that the AutoPilot was not engaged *at the time* because it had shut itself down a few seconds earlier.
As I see it, that's the thing with "AutoPilot." IMO, if it is going to shut itself down while driving, it should always pull the car onto the shoulder of the road (or at least as far off the road as it can get), come to a complete stop, and then refuse to start back up for a reasonable period of time, like 15 minutes or more, especially if no one is in the driver's seat. That would, IMO, discourage unsafe driving practices and likely contribute to a much higher degree of safety in Teslas.

But many of the Tesla crashes that have been in the news have resulted from no one being in the driver's seat. Perhaps those drivers are doing the rest of society a favor by competing for a Darwin award. However, IMO, anyone who does anything like this lacks any degree of sanity.
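For what it's worth, the policy I'm describing is simple enough to state precisely. A rough sketch (everything here is my proposal, not anything Tesla actually does):

```python
# Sketch of the fail-safe policy proposed above: on self-disengagement,
# pull over, stop, and refuse to re-engage for a cooldown period.
# Entirely hypothetical -- a proposal, not Tesla's actual behavior.

import time

LOCKOUT_SECONDS = 15 * 60  # the "15 minutes or more" suggested above

class AutopilotLockout:
    def __init__(self) -> None:
        self.locked_until = 0.0

    def on_self_disengagement(self, driver_seat_occupied: bool) -> None:
        # Pulling onto the shoulder and stopping would be driving logic;
        # here we only model the re-engagement lockout timer. An empty
        # driver's seat earns a longer lockout, per the suggestion above.
        lockout = LOCKOUT_SECONDS if driver_seat_occupied else 2 * LOCKOUT_SECONDS
        self.locked_until = time.monotonic() + lockout

    def may_engage(self) -> bool:
        return time.monotonic() >= self.locked_until
```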
 
As I see it, that's the thing with "AutoPilot." IMO, if it is going to shut itself down while driving, it should always pull the car onto the shoulder of the road (or at least as far off the road as it can get), come to a complete stop, and then refuse to start back up for a reasonable period of time, like 15 minutes or more, especially if no one is in the driver's seat. That would, IMO, discourage unsafe driving practices and likely contribute to a much higher degree of safety in Teslas.

But many of the Tesla crashes that have been in the news have resulted from no one being in the driver's seat. Perhaps those drivers are doing the rest of society a favor by competing for a Darwin award. However, IMO, anyone who does anything like this lacks any degree of sanity.
As far as I know, it doesn't turn itself off without driver intervention. It will beep like crazy and cover the screen with a red hands-on-the-steering-wheel icon if it needs to turn off. And in any scenario where it doesn't detect torque on the steering wheel, it'll start slowing down, put on its hazard lights, and keep beeping until it comes to a stop. It also won't reactivate until the car is turned off and on again.

Here’s an example of Autopilot pulling over in the news and a much better outcome than this crazy crash: https://electrek.co/2018/01/20/tesl...-s-tells-the-police-the-car-was-on-autopilot/
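That escalation sequence is essentially a small state machine. A rough sketch of the behavior as described above (state names and transitions are my guesses for illustration, not Tesla's firmware):

```python
# Rough sketch of the disengagement escalation described above: warn,
# then slow with hazards while beeping, then stop; no re-engagement
# until the car is power-cycled. Illustrative only, not Tesla's code.

from enum import Enum, auto

class APState(Enum):
    ENGAGED = auto()
    WARNING = auto()         # beeping, red hands-on-wheel icon on screen
    SLOWING = auto()         # hazard lights on, bleeding off speed
    STOPPED_LOCKED = auto()  # stopped; locked out until power cycle

def step(state: APState, wheel_torque: bool, speed_mph: float,
         power_cycled: bool) -> APState:
    if state is APState.ENGAGED:
        return APState.ENGAGED if wheel_torque else APState.WARNING
    if state is APState.WARNING:
        # Responding to the nag keeps Autopilot on; ignoring it escalates.
        return APState.ENGAGED if wheel_torque else APState.SLOWING
    if state is APState.SLOWING:
        return APState.STOPPED_LOCKED if speed_mph <= 0 else APState.SLOWING
    # STOPPED_LOCKED: only a power cycle makes Autopilot available again.
    return APState.ENGAGED if power_cycled else APState.STOPPED_LOCKED
```

Note the difference from the pull-over idea earlier in the thread: as described, the car slows in its lane with hazards on rather than steering onto the shoulder.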
 
I'm not a Tesla fan -- at all -- but the trick on those emergency braking systems is they must not have false positives. Curvy road, you can't have it decide you're going to hit the guard rail and slam on the brakes. It can't pick up leaves hanging over the road and decide to slam on the brakes. I don't expect these systems to stop if you're drunkenly swerving around and aim it at a tree -- the system could safely assume you were intending to drive around the tree. (My friend actually had just that problem with his Tesla -- false positive -- after a software update one day, apparently it decided an overhead bridge was blocking the road and slammed on the brakes in the middle of the highway!)

I'm glad the article makes it clear the AutoPilot would not reach the speeds they were going and etc. One trick Tesla used to use in reports to NHTSA was to report "The AutoPilot was not engaged at the time of the accident" -- not mentioning the AutoPilot was not engaged *at the time* because it shut itself down a few seconds earlier.
The systems I listed should have kept the car on the road and should have applied the brakes (I never said anything about coming to a full stop) before the impact with the tree. They were going 57 mph when they hit the tree, and the fastest they went was 67 mph. Those aren't crazy speeds. Those systems have nothing to do with Autopilot, since they're on all the time.
 