NHTSA report says driver in fatal Tesla Autopilot crash had seven seconds to take action

Ohhh, natural selection. The only bad thing about it is that these Darwin Award candidates end up harming thinking, responsible people along with their "BRILLIANT" acts. But sadly, that's normal. Technology will expose the extreme stupidity in individuals; just look at 60% of the internet :D. Every second person is giving advice to Fortune 500 companies :D. On with the comedy...
 
I'm sorry... but BS. If we are supposed to have these "autopilot" capabilities, or autonomous vehicles, they should absolutely be able to detect a vehicle crossing the Tesla's path. Sure, Tesla warns that the driver should always watch the road even when "autopilot" is engaged and should always be ready to take control, so the driver is at fault in this case, but that doesn't mean this isn't a safety issue with the Tesla Autopilot system. It probably doesn't need to be investigated any further, because they already know the Tesla's computers never detected the semi tractor-trailer. Tesla needs to find a way (and hopefully already is) to prevent this in the future, if we are to have any faith at all in the future of autonomous vehicles.

To finalize my opinion: the driver AND Tesla are both at fault here. The driver should have been doing what the Tesla instructions say to do, which is to watch the road and be ready to take control. AND Tesla should acknowledge that this was a fault in their system design and that it needs to be corrected.

/rant

So your rant is that both the driver and the driver-assistance technology are at fault here? So basically you're not really saying anything at all, and you're all worked up about it.

Driverless cars are coming. The technology is incubating right now, and manufacturers all over the place are racing to bring it to market. The technology is sound and will be an order of magnitude safer than human drivers. The Wall Street Journal reported that driverless cars will eliminate more than 90% of traffic fatalities for the drivers of these vehicles within the first 50 years. The net result of driverless car technology is that the roads will be many times safer.

Yet you sit and rant about sht you don't even really understand.

That's just the tip of the iceberg. Aside from not having to worry about people driving impaired (drunk, emotionally, or otherwise), it will also demolish the cost of insurance and could save certain people a lot of money. When we do go all driverless, if car insurance still exists, it will be much cheaper than what we pay now. People who previously needed to own a car and pay insurance on it can simply open an app and hail a driverless car to take them anywhere. The app could easily have a feature that picks you up on a schedule every day in the vehicle of your choice.

This also means that anyone in the family can get a ride, no license required. No longer are people restricted by their vehicles.

It is interesting that you bring up car insurance.

As an aside, it is the car insurance companies that are fighting driverless technology. It's not a question of "we want more money" - even at lower premiums, with far fewer accidents, their profits would probably still go up - but a question of liability.

What happens when a completely autonomous vehicle suffers a mechanical failure while moving and gets in an accident? Who is responsible? Is it the manufacturer, who didn't design the autopilot well enough to detect this particular case of mechanical wear beforehand and have the car drive itself to a mechanic and back to get fixed? Or the driver, who 'didn't maintain the car well enough' to prevent the failure in the first place?

Simply put: who owns the insurance policy, the driver/owner or the manufacturer? The manufacturers would prefer to take it over, both for more complex legal reasons (they are taking on additional responsibility by 'programming the driver', so it makes sense that they should be the ones to hold some kind of insurance, and being double-insured is wasteful) and as an additional selling point ("You don't need insurance with this car!"), while the insurance companies want the drivers to keep holding the policies.
 
The great thing about a lot of these autonomous cars is that they have a lot of sensors and record a lot of data. I'm certain that car manufacturers and insurers will come together to create standard procedures to avoid any uncertainty. If anything, who is at fault will only become clearer. If there was a problem on the car's side, the data will show it. On the driver's side? The data will show it, etc., etc. There may be cases that slip through the cracks, but no system is perfect. We need only remember that it is far better than what we had before.
 
Let's remember that we're talking about bureaucrats and lobbyists; facts don't matter, money does.
 
Sounds like assisted suicide. The driver didn't have the guts to drive into something himself, so he opted for the Autopilot option and increased the cruise control to 74 mph just to make sure he'd die.
 
Concur. You can't fix stupid!
 
I wasn't trolling, but you think what you want.

It literally has the highest safety ratings possible - off the scale, considering that the car broke the machine meant to break it. Without an engine, the car is lighter and the entire hood is one giant crumple zone/trunk.

Anyone who bought one knows this, and even those who don't own one are aware of this (it made the national news).

So, again, troll harder.
 
Well, it didn't make the news here; I live in New Zealand, not the US. Testing is all well and good, but when someone chooses to ignore the current speed limit (remember, they increased the speed to 74 mph), none of the safety features worked, and the driver did nothing in the seven seconds beforehand to avoid the crash, it leads me to believe the person driving (and I use that term loosely) wanted to die.
 
The posted speed limit on the road where the accident occurred was 65 mph.
https://www.google.com/search?q=Us+...d=ssl#safe=active&q=us+27+florida+speed+limit

In the US, you don't risk getting a speeding ticket until you are 10-15 mph over the limit, although this depends on the county. Some counties set up speed traps on their roads, and even 1 mph over will get you pulled over; they use it to supplement the municipal income. These are rare and localized, considering the scale of the US highway network, and usually well known.

Deliberately going 74 mph on a 65 mph road - only 9 mph over the limit - leads me to believe the driver was just setting his car to the maximum speed he knew he could get away with, without needing to worry about a ticket. He probably then just dozed off.
 