NHTSA report says driver in fatal Tesla Autopilot crash had seven seconds to take action

Shawn Knight


The National Highway Traffic Safety Administration (NHTSA) last June opened an investigation into an accident involving a Tesla Model S in which the sole occupant of the vehicle was killed. On Thursday, the NHTSA said it did not find a safety-related defect and closed the investigation.

On May 7, 2016, Joshua Brown’s Tesla Model S crashed into the side of a tractor trailer that was crossing the road ahead of him. Data from the vehicle showed that it was being operated in Autopilot mode, although the Automatic Emergency Braking (AEB) system never provided a warning and never engaged.

What’s more, the driver never attempted to brake, steer or otherwise avoid the collision. The last recorded action from the driver took place less than two minutes before the crash, when he increased the cruise control speed to 74 mph. Conditions that day were clear and dry.

NHTSA’s crash investigation team determined that the tractor trailer should have been visible to the driver for at least seven seconds before impact.
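For scale: 74 mph is about 33 meters per second, so seven seconds of visibility works out to roughly 230 meters, or around 760 feet, of sight distance.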

In conclusion, the NHTSA said it was unable to find a safety-related defect and that further examination of the issue does not appear to be warranted. As such, the investigation has been closed.

A spokesperson for Tesla told The Verge that the safety of their customers comes first and that they appreciate both the thoroughness of the NHTSA report and its conclusion.


 
I'm sorry... but BS. If we are supposed to have these "auto pilot" capabilities, or autonomous vehicles, they should absolutely be able to detect a vehicle crossing paths with the Tesla's course. Sure, sure, Tesla warns that the driver should always watch the road even when "auto pilot" is engaged and should always be ready to take control... so the driver is at fault in this case, but that doesn't mean this isn't a safety-related issue with the Tesla Autopilot system. It probably doesn't need to be investigated any more, because they know the Tesla's computers never detected the semi tractor trailer. Tesla needs to find a way (and hopefully already is finding one) to prevent this in the future if we are to have any faith at all in the future of autonomous vehicles.

To finalize my opinion: the driver AND Tesla are at fault here. The driver should have been doing what the Tesla instructions said to do, which is watch the road and be ready to take control. AND Tesla should acknowledge this was a fault in their system design that needs to be corrected.

/rant
 
IAMTHESTIG said: I'm sorry... but BS. ... To finalize my opinion: the driver AND Tesla are at fault here. ... /rant

Tesla has made it abundantly clear that Autopilot is just a fancy driver-assistance package, not fully fledged autonomous driving. Anyone who fails to recognize this shouldn't even be in the driver's seat of a Tesla.
 
IAMTHESTIG said: I'm sorry... but BS. ... To finalize my opinion: the driver AND Tesla are at fault here. ... /rant

Tesla has made it abundantly clear that Autopilot is just a fancy driver-assistance package, not fully fledged autonomous driving. Anyone who fails to recognize this shouldn't even be in the driver's seat of a Tesla.
This.

You cannot put this **** on and go to sleep; we are not there yet, and those that do... well, you die!

The road is no place to be messing around; one mistake and it could all be over for you.
 
IAMTHESTIG said: I'm sorry... but BS. ... To finalize my opinion: the driver AND Tesla are at fault here. ... /rant

So your rant is that it's the driver and the driver-assist technology at fault here? So basically you're not really saying anything at all and you're all worked up about it.

Driverless cars are coming. The technology is incubating right now and manufacturers all over the place are racing to market with it. The technology is sound and will be an order of magnitude safer than human drivers. The Wall Street Journal reported that driverless cars will eliminate more than 90% of traffic fatalities for the drivers of these vehicles within the first 50 years. The net result of driverless car technology is that the roads will be many times safer.

Yet you sit and rant about sht you don't even really understand.
 
IAMTHESTIG said: I'm sorry... but BS. ... To finalize my opinion: the driver AND Tesla are at fault here. ... /rant

Rant all you want. A Darwin Award should be given to people who can't drive defensively. I hope I won't meet you on the road.
 
IAMTHESTIG said: I'm sorry... but BS. ... To finalize my opinion: the driver AND Tesla are at fault here. ... /rant

Nah. Musk should hold a press conference where he awards the deceased driver a platinum Darwin Award, because this was platinum-level user error.
 
Driverless cars are coming. ... The net result of driverless car technology is that the roads will be many times safer.

That's just the tip of the iceberg. Aside from not having to worry about people driving impaired (drunk, emotionally, or otherwise), it will also demolish the cost of insurance and could save certain people a lot of money. When we do go all driverless, if car insurance still exists it will be much cheaper than what we pay now. People who previously needed to own a car and pay insurance on it can simply open up an app and hail a driverless car to take them anywhere. The app can easily have a feature where it picks you up on a schedule every day in the vehicle of your choice.

This also means that anyone in the family can get a ride, no license required. No longer are people restricted by their vehicles.
 
IAMTHESTIG is saying that both the Tesla AND the driver are at fault. I think that's a pretty fair summary of the case, isn't it? Surely the 'auto-pilot' feature should've been able to detect a crossing vehicle and apply the brakes, especially if, as the account reports, the driver may have had seven seconds to detect it too. The driver was clearly negligent and far too over-reliant on the 'auto-pilot' feature, but this technology needs to mature a lot more before it's made available to the public. Driving a regular car on the road demonstrates just how dumb and reckless people can be.
 
This was clearly driver error. The car he was driving was not autonomous.

Like it or not driverless cars are the future. It will take about another decade to become more common, but it will happen really fast.
 
It sounds like Tesla is using its customers to beta-test the future of driver-less cars -- releasing ever more advanced products based on customer experiences (or customer accidents). But, if the car is not yet fully autonomous, then Tesla is just putting its customers in harm's way. A driver needs to be fully engaged with the drive if they are to remain safe. If you tell a driver that they can take their hands off the wheel and let the car drive itself, then the driver is just going to take out their phone and lose themselves in the online world while their car makes a beeline for the nearest (wall, vehicle, pedestrian).
 
Heck, if a person can sue Apple because the other person that hit them was texting, then obviously it must be the car's fault and not the driver's. (Note the sarcasm, in case you couldn't tell.)
 
It sounds like Tesla is using its customers to beta-test the future of driver-less cars... Tesla is just putting its customers in harm's way. ...

When you turn on the feature, the car tells you every single time: "your hands should remain on the wheel at all times". Customers are putting themselves in harm's way... not Tesla. Tesla put the technology out there; now people have to use it properly (instead of blaming their own careless acts on the company).

It's like blaming a gun manufacturer because of a home incident where a child shot someone. The one to blame there is not the "gun technology" but the parents' negligence for not storing it safely in the first place (not using the technology correctly).
 
I'm not against getting rid of people who refuse to take responsibility for their own driving, but smashing up cars to do it is extremely inefficient and puts me in harm's way as well.
 
Some of you aren't remembering that right after this happened, Tesla fired the company that was providing its Autopilot software and went in a new direction, because the software should have seen the truck but didn't. This accident is 100% the fault of the careless driver who forgot he was the driver, but Tesla did immediately work to correct the software issue.
 
It sounds like Tesla is using its customers to beta-test the future of driver-less cars... Tesla is just putting its customers in harm's way. ...

Tesla has said time and time again (the car even warns you) to keep your hands on the wheel and pay attention. Their technology is just there to assist the driver and is not yet a fully autonomous solution; no automaker has that yet. To a certain extent this is beta testing for future driver-assist technology: miles need to be logged and data collected, and there are just far too many things that can and will happen on a road for any system to be programmed to account for every last one of them. Do you not find it odd that no other auto manufacturer has attempted such a system? Other similar technologies exist, but none of them make the claims that people seem to assume Tesla is making, because, simply put, we are not there yet.

Personally, I don't believe a fully automated driving experience will ever exist until it's made mandatory and every vehicle on the road has it, because the biggest variable right now is human drivers, whose stupidity can never be accounted for. I can only hope this takes another 50 years to happen; I actually enjoy driving and pay attention when I do. Right now we have a solution for those who feel driving is a chore and can't be bothered to do so in a safe manner: it's called the bus.
 
Tesla has said time and time again (the car even warns you) to keep your hands on the wheel and pay attention. ...
I wonder: if drivers in NON-Autopilot vehicles followed Tesla's instructions/advice to keep their hands on the wheel and be ready to react (aka pay attention) all the time, would this be such an issue in the first place? The majority of accidents are caused by distracted driving, after all.
 
@Adhmuz -- I like to 'drive' as well; I presume we differ on what constitutes 'driving', as I find passing time on I-80 in the flyover states for eight hours there and back, visiting my family four-ish times a year, to be exactly as you described: "a chore".
I hope that Autopilot "two-point-oh" and beyond will be able to mitigate some of that drudgery.
RE the bus: if you can suggest a midwest US motor coach with seats and space remotely as nice as your average five-year-old, beat-half-to-death Malibu, I'm all over it, lol (I'll even throw in willingness to taxi the last 5-10 miles to and from the extremely convenient bus stations that we load/unload from).
 
So your rant is that it's the driver and the driver-assist technology at fault here? ... Yet you sit and rant about sht you don't even really understand.
It was a rant, dude, and I understand perfectly. Let me simplify it: this tech isn't ready for people to rely on yet; it appears this guy may have done just that, and that is his own fault. Tesla knows the tech isn't ready and advises drivers of this fact, but considering their goal is clearly fully autonomous driving, this crash demonstrates to Tesla that vehicle detection is an area that needs improvement to prevent this type of accident from occurring.
 