US investigators link Tesla Autopilot to dozens of deaths and almost 1,000 crashes

Daniel Sims

The big picture: Tesla vehicles are no strangers to controversy, with the company's Autopilot system being a major source of negative press. However, a recent report from US regulators puts the scale of Tesla Autopilot failures in hard numbers. The hundreds of crashes and dozens of deaths were primarily the result of drivers misunderstanding what "Autopilot" really means.

A newly published report from the National Highway Traffic Safety Administration (NHTSA) links Tesla's Autopilot system to nearly 1,000 crashes over the last few years, more than two dozen of them fatal. Most were caused by inattentive drivers who may have falsely believed that the company's driver assistance systems amounted to full-blown self-driving.

The investigation details 956 crashes between January 2018 and August 2023, resulting in well over 100 injuries and dozens of deaths. In many incidents, the crash occurred several seconds after the vehicle's Autopilot system detected an obstruction it didn't know how to negotiate, leaving ample time for an attentive driver to avoid the accident or minimize the damage.

In one example from last year, a 2022 Model Y on Autopilot hit a minor stepping out of a school bus in North Carolina at "highway speeds." The victim suffered life-threatening injuries, and an examination revealed that an observant driver should have been able to avoid the accident.

Tesla's Autopilot functionality isn't a fully autonomous driving system. Classified as a Level 2 driver assistance system, it simultaneously handles tasks like steering, braking, acceleration, lane centering, and adaptive cruise control. However, it still requires the driver to keep their eyes on the road and both hands on the steering wheel.

Also see: The Six Levels of Self-Driving Systems

A 2022 study revealed that many drivers mistakenly believe that existing driver assistance functions like Tesla's Autopilot make cars fully autonomous. Mercedes-Benz recently became the first company to sell Level 3 vehicles in the US, which can drive autonomously in limited scenarios. Still, the automaker's self-driving vehicles can only operate on certain California highways, during the daytime, and in clear weather.

Inclement weather and challenging road conditions were behind some of the Tesla incidents in the NHTSA report. In 53 cases, the autosteering function failed when the car lost traction. In another 55 cases, drivers inadvertently triggered a manual override by moving the steering wheel and crashed almost immediately because they believed Autopilot was still engaged.

Tesla has addressed prior Autopilot flaws with over-the-air updates, but perhaps it should focus on better communicating the feature's limits to users.

 
I hate that they are going for the "driver didn't understand" angle. This is an easy concept: When a human has to babysit something with no interaction, it gets REAL boring, REAL fast. Imagine being on the highway, traveling in a straight line, having to look at the road and keep your hands on the wheel but not actually doing ANYTHING.

Prime ticket to snooze town. All these "self driving" systems are just inviting trouble.
 
I hate that they are going for the "driver didn't understand" angle. This is an easy concept: When a human has to babysit something with no interaction, it gets REAL boring, REAL fast. Imagine being on the highway, traveling in a straight line, having to look at the road and keep your hands on the wheel but not actually doing ANYTHING.

Prime ticket to snooze town. All these "self driving" systems are just inviting trouble.

-Yep, supervised "autopilot" seems like more of a hassle and more work than just driving.
 
...That's it? How does that compare to overall vehicle statistics over that time?

That said, "autopilot" will always be the most stupidly misleading name for the system...
 
...That's it? How does that compare to overall vehicle statistics over that time?

Yes, that's the obvious question. There are about 45,000 automotive fatalities per year in the US, so the "dozens" of Tesla autopilot fatalities would make for a very small percentage. Of course, the percentage of all automotive miles that are Tesla with Autopilot enabled is also going to be small, but I don't even have a ballpark of what that percentage is.

There's an obvious follow-up even from that first question, which is: for a typical "driver like me", how does autopilot compare? Of the total automotive fatalities, I'd assume a disproportionate number are associated with drunk driving, inexperienced drivers, bad conditions, and other factors that don't apply to me.

Just going by pure intuition, there might be a "worst case" type of driver you'd wish would enable autopilot every time they were driving, or especially when they're driving home drunk from the bar again at three times the legal limit. But I'm not sure the odds improve in the same way when you're talking about drivers on the safer end of the spectrum.
 
Anything to make money, eh Elon?

Many more people will be injured or die because of Tesla's and its customers' continuing negligence.
 
Simply put, Tesla should be forced to withdraw the name and the product until they can prove they have it right. Otherwise, Musk will continue to needlessly cost lives and cause property damage, all in the name of his profits ....
 
I myself applaud Tesla, they are allowing Darwin to take care of the derps we don't really want in this world.
 
There are more than 5 million auto accidents per year in the US, which means there's been at least 20 million such "in the last few years". This is just another anti-Musk hit piece.
 
I hate that they are going for the "driver didn't understand" angle. This is an easy concept: When a human has to babysit something with no interaction, it gets REAL boring, REAL fast. Imagine being on the highway, traveling in a straight line, having to look at the road and keep your hands on the wheel but not actually doing ANYTHING.

Prime ticket to snooze town. All these "self driving" systems are just inviting trouble.
I don't see myself using anything less than Level 3 ADAS. I could see Level 3 being useful on I-5. But, quite frankly, anything less than Level 3 sounds more like "You're sitting in the passenger seat while a deaf-mute student driver controls the car," only worse, because you honestly don't know what it's going to do. Engaging an ADAS of less than Level 3 would be more anxiety-inducing than relaxing for me.
In short, I don't want this, nor do I need it.
 
There are about 45,000 automotive fatalities per year in the US, so the "dozens" of Tesla autopilot fatalities would make for a very small percentage.
My wife was in stop-and-go traffic behind the Autopilot-involved collision featured in the link below and was forced to witness most of the aftermath. In this case, the failure was two-fold: the driver was 'trusting' his Autopilot while preoccupied w/his phone AND Autopilot got 'confused' by the motorcycle in front of it and suddenly lurched forward and rolled on top of the bike/rider, killing him. I agree the vast majority of drivers exert only a fraction of the attention they should while driving, but this system is simply not ready for consumer use.

https://www.heraldnet.com/news/tesla-driver-on-autopilot-caused-fatal-highway-522-crash-police-say/
 
My wife was in stop-and-go traffic behind the Autopilot-involved collision featured in link below and was forced to witness most of the aftermath. In this case, the failure was two-fold: the driver was 'trusting' his Autopilot while preoccupied w/his phone AND Autopilot got 'confused' by the motorcycle in front of it and suddenly lurched forward and rolled on top of the bike/rider,...
Frankly, I don't buy it -- especially as you're using the exact language from the article, including the excuse the driver gave the responding trooper. At this point, there's no proof Autopilot was even engaged, other than the word of someone who's committed felony manslaughter. And for a vehicle already traveling at speed down a highway, to describe it as "lurching forward" is fishy at best. Did the driver, not seeing the motorcyclist, attempt to accelerate?

In the majority of past cases the media has chosen to report, the subsequent investigation finds that either Autopilot wasn't engaged at all, or the driver chose to override it, with serious consequences. Yes, the feature isn't perfect, but we'll need more than the word of the driver to make that determination.
 
Frankly, I don't believe you -- especially as you're using the exact language from the article, including the excuse the driver gave the responding trooper. At this point, there's no proof Autopilot was even engaged, other than the word of someone who's committed felony manslaughter. And for a vehicle already traveling at speed down a highway, to describe it as "lurching forward" is fishy at best. Did the driver, not seeing the motorcyclist, attempt to accelerate?

In the majority of past cases the media has chosen to report, the subsequent investigation finds that either Autopilot wasn't engaged at all, or the driver chose to override it, with serious consequences. Yes, the feature isn't perfect, but we'll need more than the word of the driver to make that determination.
Since you need me to repeat myself: it was stop-and-go traffic (please read carefully before replying). A long back-up of painfully slow-moving cars, typical of this stretch of highway every afternoon/evening. Obviously, I'd share your sentiment if they were traveling at highway speed, but this wasn't the case. Anyway, Autopilot's issues with motorcycles are well-documented online.

Frankly, it's irrelevant what you believe. My assertions come from insight gained through a personal relationship w/responding EMT staff, and of course the eyewitness I'm married to (whom I didn't lie about, but indeed knock yourself out with your 'belief'). CNBC and MSN also have articles on this incident, but you're right - the guy could have been lying to police about being on his phone with Autopilot engaged.

If Autopilot was engaged in this case, I fault the system a bit more than the driver on this one, given how fast this all happened (again, according to the eyewitness, so take that as far as your 'belief' allows). If Autopilot wasn't actually engaged, it must have been a hell of a sneeze (or a jump scare from whatever he was watching on his phone) that caused him to jam on the accelerator hard and long enough to knock over and roll onto the motorcyclist (even in a super-torquey Tesla). The car is in police custody, and they are working toward accessing its data to prove it one way or the other.
 
If Autopilot was engaged in this case, I fault the system a bit more than the driver ...
First you stated that Autopilot "got confused", now you admit you're not even sure it was engaged? I'll leave my conclusions to the investigation, but as for Autopilot having "well documented" issues, I'll leave you with some actual data:

"...The National Highway Traffic Safety Administration closed a long-standing investigation into Tesla’s Autopilot system after reviewing 956 crashes ... In just over half (489), either the system was not engaged, the other driver was at fault, or there was insufficient data. Of the 467 crashes remaining: 145 involved the vehicle leaving the roadway due to low traction conditions (snow, ice, or water), in 111 cases, the driver invertedly disengaged Autopilot, and 211 where "Autopilot struck a vehicle or obstacle with enough time for an attentive driver to respond and avoid" the crash....
 
First you stated that Autopilot "got confused", now you admit you're not even sure if it was engaged or not? I'll leave any conclusions to the investigation, but as for Autopilot having "well documented" issues, I'll leave you with some actual data:

The National Highway Traffic Safety Administration closed a long-standing investigation into Tesla’s Autopilot system after reviewing 956 crashes ... In just over half (489), either the system was not engaged, the other driver was at fault, or there was insufficient data. Of the 467 crashes remaining: 145 involved the vehicle leaving the roadway due to low-traction conditions (snow, ice, or water); in 111 cases, the driver inadvertently disengaged Autopilot; and in 211, "Autopilot struck a vehicle or obstacle with enough time for an attentive driver to respond and avoid" the crash.
Duh, no, it hasn't been proven beyond a shadow of a doubt that Autopilot was engaged (hence the word "If"), and I was presenting the two options, not 'changing a story'. Wow, man - carefully read all the words for full context.
That 'actual data' you copy/pasted is actual statistics, yes - but isn't at all relevant to the point I'm referring to: Autopilot still has ongoing issues with smaller or randomly moving objects around it. Tesla is plugging holes right and left (for the better), but it's not nearly perfect.
 
I was presenting the two options, not 'changing a story'. Wow, man - carefully read all the words for full context.
Shrug, I can't read words before you post them. Your first post claimed definitively that Autopilot was at fault. When challenged, you backtracked.

Automobile accidents have already killed more than 60 million people worldwide -- more than the population of most countries. The only way that will ever change is autonomous vehicles. Feverishly spreading disinformation on the tech is ill advised.

That 'actual data' you copy/pasted is actual statistics, yes - but isn't at all relevant
It certainly is relevant, as those accidents investigated include both auto *and* motorcycle accidents -- all of which could have been avoided had the driver used the system properly.
 
Your first post claimed definitively that Autopilot was at fault. When challenged, you backtracked.
Granted, if we're anally arguing specific semantics just to win an Internet debate, I did say that w/out learning the results of the data collection (which I personally know is still pending); however, the constellation of symptoms, which I have more than the average article-reader's knowledge of, makes my assertion a notably educated one. (Dramatic analogy: If I come home to smoke pouring out of every window and rooftop of my house, I don't wonder if my vinyl collection inside the house is melted.)
*Laughingly wondering if you're backtracking on your initial accusation of me being a liar.

Automobile accidents have already killed more than 60 million people worldwide -- more than the population of most countries. The only way that will ever change is autonomous vehicles.
This is another tangent, unrelated to the issue between the Tesla and the motorcycle, but one I 110% agree on. As I stated in a previous reply, I have minimal faith that the individuals driving around me are paying enough attention to operating their vehicles. When teaching my kids to drive, my mantra was always 'expect every single vehicle or living thing around you to do something unexpected at any moment and be prepared to react'.

Further off on that tangent: in the interim before autonomous driving is fully ready in the coming decades, the US should embrace more solutions to reduce accidents, such as greatly increasing the barriers to entry to becoming a licensed driver (e.g. Germany is a prime benchmark of where this is done much better), retests, stiffer penalties for violators, and better public transportation in most areas. Driver's Ed in the US these days basically shows students what the road signs mean and spends minimal class or field time teaching students how to actually handle a vehicle or respect its weight and ability to wreak havoc. We're generally releasing drivers out into the wild with an entitled mentality where operating a motor vehicle is a right, not an earned privilege. This is made worse by people's increased reliance on and trust in driver's aids (all car brands), lessening attentiveness during their travels.

As a recreational driver of high-performance cars on closed tracks, I knew the first priority for my kids was to send them through multi-day advanced driving courses on closed tracks, along with classroom time covering everything generic driver's ed doesn't, before and after they got their licenses. It's expensive but worth every penny. After about ten years, my eldest's only incident so far was backing into another car in a parking lot that was also backing up. 😳

Feverishly spreading disinformation on the tech is ill advised.
You have got to be a Tesla owner (or maybe a stockholder)? No one else would be so quick to bury their head so deeply in the sand. 🤷‍♂️ So, in your 'expertise', Autopilot has no faults currently? Is that the 'disinformation' I'm spreading by not agreeing that the current tech is ready for broad human use? You seem to believe it is?

all of which could have been avoided had the driver used the system properly.
"All" - now THERE'S some disinformation. lol
*Being a blind apologist for the current state of autonomous tech is also firmly in the category of disinformation.
 
Granted, if we're anally arguing specific semantics just to win an Internet debate, I did say that. *Laughingly wondering if you're backtracking on your initial accusation of me being a liar.
It's not "arguing semantics" to note that your statement was wrong. And since you've retracted it, there are only two possibilities: you were either (a) lying or (b) simply zealously overstating your case. I didn't choose one over the other.

w/out learning the results of the data collection (which I personally know is still pending); however, the constellation of symptoms, which I have more than the average article-reader's knowledge of, makes my assertion a notably educated one.
Appeal to authority fallacy noted. But where's your evidence, other than "trust me! I'm smart!"?

"All" - now THERE'S some disinformation. lol
Disinformation? I gave you the results of the official US NHTSA investigation. Of all reported accidents investigated, they failed to find even one single case where Autopilot was definitively at fault. I'm sorry this fact disturbs you so much, but there it is.

The only thing the NHTSA dinged Tesla for was not more closely monitoring drivers to ensure they were actually paying attention while Autopilot was engaged. Tesla has since responded by adding more driver-attention monitoring to the system. But, to be fair, shouldn't **all** vehicles monitor their drivers to see that they're properly watching the road?

You have got to be a Tesla owner (or maybe a stockholder)?
No, I'm simply a fan of the truth. Hopefully I'll convert you to that cause as well.
 