Tesla Model S hits police SUV while in Autopilot mode

midian182

What just happened? There’s been yet another crash involving a Tesla vehicle that was in Autopilot mode. This time, a Model S driver hit a parked police SUV in Laguna Beach, California. The driver sustained minor injuries; fortunately, the cruiser was unoccupied at the time of the accident.

Sgt. Jim Cota, the public information officer for the Laguna Beach Police Department, who tweeted about the incident, told The Guardian that the SUV was a “complete total loss” and that the Tesla’s front end was “pretty beat up.”

“It [the SUV] was mangled up pretty good. It took out the whole back end and halfway through the center part of the vehicle. There’s axle damage. It wouldn’t be worth repairing,” he said.

This is the latest crash involving a Tesla with the Autopilot feature engaged, and it marks the third time this year that one of the company’s vehicles has hit a stationary emergency vehicle while in that mode.

A driver in China is thought to have been the first person killed while using Autopilot, back in January 2016. In June that same year, Joshua Brown was the first person in the US to die while using the feature.

A Model X hit a highway divider while in Autopilot mode in March, killing driver Walter Huang. Tesla said its own investigation showed Huang kept his hands off the wheel despite the vehicle’s warnings. The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board continue to investigate the case.

Last week, Tesla settled a class-action lawsuit brought by six Model S and Model X owners who alleged that the Autopilot system was “essentially unusable and demonstrably dangerous.” The settlement didn’t address the safety allegations; instead, it compensated owners for the delayed Autopilot updates.

Tesla reiterated that the Autopilot feature requires drivers to keep their hands on the wheel and is designed for use on highways that have a center divider and clear lane markings.

The Autopilot system warns drivers who don't hold on to the wheel. Ignoring these warnings will eventually see the feature disabled, and can even cause the vehicle to pull over and come to a stop as a safety precaution. But some owners continue to leave the driver's seat when it is activated, including a UK man who was caught on video sitting in the passenger seat while his Tesla did the driving.


 
I wonder how many human-controlled vehicles crashed into a stationary vehicle that day...

I'm starting to think that Captain Cranky has been right all along and that Musk sells to the least intelligent people possible:

The Autopilot system warns drivers who don't hold on to the wheel. Ignoring these warnings will eventually see the feature disabled, and can even cause the vehicle to pull over and come to a stop as a safety precaution. But some owners continue to leave the driver's seat when it is activated, including a UK man who was caught on video sitting in the passenger seat while his Tesla did the driving.

If this keeps up, you'll be able to use any handicap parking space in the country just by owning a Tesla.
 
I've come to a conclusion about the mindset of Tesla owners: it matches exactly that of Apple customers.

I think Apple should buy Tesla, to stockpile all its faithful, undiscourageable idi0ts.
I'm sorry, but why are you blaming them for what is "allegedly" human stupidity? We don't know yet for this one, but so far it's been user error that caused the past accidents. Who seriously leaves the driver's seat? O_o
 
The most optimistic must hope for a systems failure as the cause. The PRIMARY rule of navigation is 'avoid collision'. A parked vehicle should be safe from attack by so-called 'autopilot' even if said vehicle is parked in the middle of the roadway, straddling three lanes, etc. It's now 3 strikes. It seems that the software/systems are unreliable in a primary way.
 
The most optimistic must hope for a systems failure as the cause. The PRIMARY rule of navigation is 'avoid collision'. A parked vehicle should be safe from attack by so-called 'autopilot' even if said vehicle is parked in the middle of the roadway, straddling three lanes, etc. It's now 3 strikes. It seems that the software/systems are unreliable in a primary way.
Musk believed that Tesla's radar-based Autopilot could be made as accurate as a lidar system through software. He needs to rethink that position.
 
Musk believed that Tesla's radar-based Autopilot could be made as accurate as a lidar system through software. He needs to rethink that position.

Perhaps the autopilot system's radar has a limited field of view... I can see how an optical-only system could fail to see other vehicles and large objects, but radar?

Yet again, I declare that they need more sensor data. They need lots more data to interpret their surroundings. It is unfathomable, though, hitting a parked, stationary vehicle.
 
I've come to a conclusion about the mindset of Tesla owners: it matches exactly that of Apple customers.

I think Apple should buy Tesla, to stockpile all its faithful, undiscourageable idi0ts.
I'm sorry, but why are you blaming them for what is "allegedly" human stupidity? We don't know yet for this one, but so far it's been user error that caused the past accidents. Who seriously leaves the driver's seat? O_o
Personally, I do not see this as human stupidity.

IMO, as I have said in other threads on this matter, if the autopilot cannot get a human to take control, there is no way that it should disable itself and just keep driving. That is the most asinine thing I have ever heard.

As I see it, what it should do as an ethical matter is pull off to the side of the road safely and refuse to drive further in autopilot mode until the driver takes control. Perhaps it should also refuse to re-enable for the remainder of the trip, or at least for a "time out" period of 30 minutes, since the dolts driving are acting like little children in the first place.
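As a rough sketch, this is the kind of escalation logic I mean - everything below (the thresholds, the method names, the car and state objects) is hypothetical, not Tesla's actual code:

```python
import time

# Hypothetical escalation policy: warn, then pull over, then lock Autopilot
# out for a cooldown. All thresholds and method names are invented.
WARN_AFTER_S = 10        # seconds hands-off before the first warning
PULL_OVER_AFTER_S = 30   # seconds of ignored warnings before pulling over
LOCKOUT_S = 30 * 60      # "time out": refuse to re-enable for 30 minutes

def autopilot_tick(hands_off_s, car, state):
    """One control-loop step deciding how to escalate a hands-off driver."""
    if state.get("locked_until", 0) > time.time():
        car.refuse_autopilot()        # still inside the time-out window
    elif hands_off_s >= PULL_OVER_AFTER_S:
        car.pull_over_safely()        # shoulder stop, hazards on
        state["locked_until"] = time.time() + LOCKOUT_S
    elif hands_off_s >= WARN_AFTER_S:
        car.warn_driver()             # audible/visual nag
    # otherwise: driver is engaged, keep driving normally
```

The point is simply that "stop safely and refuse to continue" is an ordinary state machine, not some exotic engineering challenge.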

There was an article here on TechSpot in the past few weeks claiming that the engineers who worked on the autopilot wanted to have the car do something like that, but it was nixed by Tesla, and perhaps by Musk himself, as being too expensive - which Tesla promptly denied, as would be expected, because denying it is a cover-Tesla's-a$$ PR move.

To me, people place entirely too much faith in Tesla and Musk, and his supposed "genius" is seriously giving a bad reputation to self-driving vehicles.

As I see it, anyone who does not yet see that Musk is an overgrown, temper-tantrum-throwing baby in a man's body is denying reality.
 
I wonder how many human-controlled vehicles crashed into a stationary vehicle that day...

Think much? SMH. Let's compare miles driven by humans vs. by self-wrecking cars. You are so blind that you want to put "tech" in at the risk of human safety. Would you walk out in front of one of these things? If not, why do you want to force it on us?
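If you actually want that comparison, it has to be per mile driven, not per day. A back-of-the-envelope sketch - every number below is a placeholder, not a real statistic:

```python
# Crude per-mile crash-rate comparison. All figures are made-up
# placeholders; substitute real data before drawing any conclusions.
human_crashes = 6_296_000          # hypothetical US crashes per year
human_miles = 3_200_000_000_000    # hypothetical vehicle-miles per year

autopilot_crashes = 40             # hypothetical Autopilot-engaged crashes
autopilot_miles = 1_200_000_000    # hypothetical Autopilot-engaged miles

def per_million_miles(crashes, miles):
    """Normalize raw crash counts to crashes per million miles driven."""
    return crashes / miles * 1_000_000

print(f"Human:     {per_million_miles(human_crashes, human_miles):.2f} crashes/million miles")
print(f"Autopilot: {per_million_miles(autopilot_crashes, autopilot_miles):.2f} crashes/million miles")
```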

“complete total loss”? Umm, hey buddy, what is it about "complete" and "total" that you would need both in the same sentence? Maybe I don't understand English very well.
 
I've come to conclusion about the mindset of Tesla owners that matches exactly that of Apple customers.

I think Apple should by Tesla, to stockpile all its faithful undiscourageable idi0ts.
As I see it, your comparison is spot-on:

Steve Jobs about the iPhone antenna fiasco - "You're holding it wrong!"

Elon Musk about these asinine autopilot failures - "You're driving it wrong!"
 
Personally, I do not see this as human stupidity.

IMO, as I have said in other threads on this matter, if the autopilot cannot get a human to take control, there is no way that it should disable itself and just keep driving. That is the most asinine thing I have ever heard.

As I see it, what it should do as an ethical matter is pull off to the side of the road safely and refuse to drive further in autopilot mode until the driver takes control. Perhaps it should also refuse to re-enable for the remainder of the trip, or at least for a "time out" period of 30 minutes, since the dolts driving are acting like little children in the first place.

There was an article here on TechSpot in the past few weeks claiming that the engineers who worked on the autopilot wanted to have the car do something like that, but it was nixed by Tesla, and perhaps by Musk himself, as being too expensive - which Tesla promptly denied, as would be expected, because denying it is a cover-Tesla's-a$$ PR move.

To me, people place entirely too much faith in Tesla and Musk, and his supposed "genius" is seriously giving a bad reputation to self-driving vehicles.

As I see it, anyone who does not yet see that Musk is an overgrown, temper-tantrum-throwing baby in a man's body is denying reality.
It does come to a stop if forced to relinquish control because of human ignorance. During the time that Autopilot has been in control, how many accidents has it been involved in? Even little minor ones? Now, during that same time period, how many people killed other people while driving? How many fender benders? How many dead animals? How many dead children? Let me tell you: one you can count on one or two hands; the other you can’t count at all. Neither can the dead. You need to step back and look at the big picture here. It’s a picture that could very likely still have your mom, your dad, your brother, your sister, your pet still in it, because they didn’t get killed by a human driver not paying attention.
 
Sue Tesla into the ground until they stop marketing their system as an "Autopilot" and issue a recall of all vehicles to disable the feature.
Thankfully there are still very few of these vehicles on the roads. Can you imagine if the car were affordable and there were millions of them out there at any one time? It would be a slaughter.
 
It does come to a stop if forced to relinquish control because of human ignorance. During the time that Autopilot has been in control, how many accidents has it been involved in? Even little minor ones? Now, during that same time period, how many people killed other people while driving? How many fender benders? How many dead animals? How many dead children? Let me tell you: one you can count on one or two hands; the other you can’t count at all. Neither can the dead. You need to step back and look at the big picture here. It’s a picture that could very likely still have your mom, your dad, your brother, your sister, your pet still in it, because they didn’t get killed by a human driver not paying attention.
Yes, it comes to a stop. By crashing. Yes, that is technically coming to a stop.

As I see it, your argument is a straw man.
 
I wonder how many human-controlled vehicles crashed into a stationary vehicle that day...

It is much, much worse when an AutoPilot automatically pilots itself into a stationary vehicle, because the AutoPilots are neither being punished, nor taken out of the driver pool, nor being corrected or retrained as a human would be. Elon Musk, creator of the AutoPilot, is not being punished, but you would be if you were 'piloting' this vehicle.

Each of the humans who crashed into stationary vehicles that day will be punished in their own way. Some will be removed from the driver pool. Some will have removed themselves by dying.

However, the bug in the software that caused this Tesla to drive into a stationary vehicle will not be fixed by Tesla. None of the previous bugs causing the deaths and crashes have been fixed either. So, every single Tesla still on the road has all the same bugs, ready to kill or crash into stationary vehicles.

Or, if there are 150,000 Teslas with this software on the road, there are 150,000 (-1) bad drivers ready to crash into stationary vehicles who have not been punished, corrected, or modified in any way.

Hope that helps you understand the difference between a human (like yourself) crashing into a stationary vehicle vs Elon Musk's AutoPilot piloting itself into stationary vehicles.
 
So shouldn't the car pull over once someone leaves the driver's seat? Why allow it to continue when they aren't there, whether or not their hands are on the wheel?

Society is getting lazy. This is proof positive of it.
 
So shouldn't the car pull over once someone leaves the driver's seat? Why allow it to continue when they aren't there, whether or not their hands are on the wheel?

Society is getting lazy. This is proof positive of it.
This is exactly my point. If the car is supposed to pull over, why is it not making that decision at times like this, instead of running into things like a fire truck at 65 MPH? As I see it, there is a major problem with Tesla "autopilot."

What would happen if someone had a seizure or something? Though they may be rare, there are times when people become incapacitated through no fault of their own. If the car cannot handle these times safely, then it should not be on the road.
 
I wonder how many human-controlled vehicles crashed into a stationary vehicle that day...
Are human-driven cars advertised as being safer?

no?

autopilot cars ARE. That's why, when autopilot screws up and does this, it faces such scrutiny.

After all, these cars have sensors, and cameras, and auto-braking. How are crashes like this STILL happening?

Musk's hubris in naming his driving aids "autopilot," fully knowing people would misinterpret what that means, is going to sink his brand with all this negative press.
 
Tesla is at the point that they need to remove that feature until they can perfect it, or start calling it driver assist instead of autopilot ..... I am surprised that we haven't seen any government action along those lines .... yet.
 
Personally, I do not see this as human stupidity.

IMO, as I have said in other threads on this matter, if the autopilot cannot get a human to take control, there is no way that it should disable itself and just keep driving. That is the most asinine thing I have ever heard.

As I see it, what it should do as an ethical matter is pull off to the side of the road safely and refuse to drive further in autopilot mode until the driver takes control. Perhaps it should also refuse to re-enable for the remainder of the trip, or at least for a "time out" period of 30 minutes, since the dolts driving are acting like little children in the first place.

There was an article here on TechSpot in the past few weeks claiming that the engineers who worked on the autopilot wanted to have the car do something like that, but it was nixed by Tesla, and perhaps by Musk himself, as being too expensive - which Tesla promptly denied, as would be expected, because denying it is a cover-Tesla's-a$$ PR move.

To me, people place entirely too much faith in Tesla and Musk, and his supposed "genius" is seriously giving a bad reputation to self-driving vehicles.

As I see it, anyone who does not yet see that Musk is an overgrown, temper-tantrum-throwing baby in a man's body is denying reality.
It does come to a stop if forced to relinquish control because of human ignorance. During the time that Autopilot has been in control, how many accidents has it been involved in? Even little minor ones? Now, during that same time period, how many people killed other people while driving? How many fender benders? How many dead animals? How many dead children? Let me tell you: one you can count on one or two hands; the other you can’t count at all. Neither can the dead. You need to step back and look at the big picture here. It’s a picture that could very likely still have your mom, your dad, your brother, your sister, your pet still in it, because they didn’t get killed by a human driver not paying attention.

Exactly. People are looking at these anecdotes (which grab headlines) and go "See! I told you it was bad!"

Okay. Bad compared to what? Compared to human error, even when controlling for drivers, vehicles, conditions, etc., the autopilot, _even when used incorrectly_, is vastly safer.

This is akin to the airplane/car safety risks everyone learns in middle school. Planes are much safer but people fear them because they think of specific plane crashes.
 
Are human-driven cars advertised as being safer?

no?

autopilot cars ARE. That's why, when autopilot screws up and does this, it faces such scrutiny.

After all, these cars have sensors, and cameras, and auto-braking. How are crashes like this STILL happening?

Musk's hubris in naming his driving aids "autopilot," fully knowing people would misinterpret what that means, is going to sink his brand with all this negative press.

No one has a right to drive. This is one reason why inspections and licenses are required. By treating autopilot and human-controlled cars as equally valid under the law, it creates a false impression that human-driven cars are just as safe. They are not.

This is also a huge reason why we are debating this. While autopilot *is* safer (at least if you care about statistics -- even when normalized), it is also regulated. At any moment the public backlash from isolated events could drastically change a new industry. Something the legacy carmakers would prefer (because that means less competition).

I'm not afraid of this so much at the regulatory level (say, the state DMV) but more so at the legislative level, where an amendment could be slipped into a major bill to change the industry without careful review.
 
Here is why autopilot will never become super common... We humans cause accidents all the time, an average of 12,250 per day as of 2015. A single autopilot vehicle makes a mistake and OMG THEY'RE OUT TO KILL US!

Autopilot will never be perfect, but it will always be better than humans. Apparently that's not good enough.
 