Many people think driver assistance features make a car fully autonomous

midian182

Facepalm: Despite auto manufacturers repeatedly reminding customers that a 'driver assistance system' does not mean a vehicle is capable of full self-driving, many owners treat them that way. Putting too much faith in the likes of Tesla's Autopilot can be dangerous, especially as users often eat, text, or do other distracting things while these systems are active.

The findings come from a study by the Insurance Institute for Highway Safety (IIHS), a nonprofit organization funded by auto insurance companies that aims to reduce vehicle-related deaths and injuries.

The IIHS study, which covered 600 active users, found that 53% of Super Cruise (General Motors), 42% of Autopilot (Tesla), and 12% of ProPILOT Assist (Nissan) owners said that they were comfortable treating their vehicles as fully self-driving. Some added that they were happy to let the cars drive themselves in inclement weather and parking lots.

Tesla has always said that users of its Autopilot feature must be attentive and keep both hands on the wheel while it's activated. Like Super Cruise, it can lock out owners who aren't paying attention. About 40% of the study's participants admitted that the systems had shut them out at some point while driving and would not reactivate.

"The big-picture message here is that the early adopters of these systems still have a poor understanding of the technology's limits," said IIHS president David Harkey. "But we also see clear differences among the three owner populations. It's possible that system design and marketing are adding to these misconceptions."

The IIHS notes that most driver assistance systems comprise two elements: adaptive cruise control, which keeps the vehicle at a set speed while maintaining a set distance from the car ahead, and lane centering, which keeps the vehicle in the middle of its lane. Some systems can also perform lane changes and other advanced maneuvers.
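To make that division of labor concrete, here is a minimal, purely illustrative sketch in Python of how those two elements could be expressed; the function names, gains, and sign conventions are assumptions for the example, not any automaker's actual control code.

# A minimal, illustrative sketch (assumed names and gains, not any automaker's code)
# of the two elements the IIHS describes: adaptive cruise control and lane centering.

def adaptive_cruise_speed(set_speed, gap_to_lead, desired_gap, lead_speed):
    """Hold the driver's set speed, but slow down to restore the desired gap."""
    if gap_to_lead < desired_gap:
        # Too close: match the lead car and shed extra speed in proportion to the shortfall.
        return min(set_speed, lead_speed - 0.5 * (desired_gap - gap_to_lead))
    return set_speed

def lane_centering_steer(lateral_offset_m, gain=0.1):
    """Return a small steering correction back toward the lane center."""
    return -gain * lateral_offset_m

# Example: set speed 30 m/s, lead car doing 25 m/s only 20 m ahead (40 m desired),
# ego car drifting 0.3 m off the lane center.
target_speed = adaptive_cruise_speed(set_speed=30.0, gap_to_lead=20.0,
                                     desired_gap=40.0, lead_speed=25.0)
steering = lane_centering_steer(0.3)
print(target_speed, steering)  # slows well below the lead car's speed; nudges back toward center

Even in this toy form, neither element watches for pedestrians, reads traffic signals, or handles weather, which is why such systems remain driver aids rather than self-driving.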

GM's Super Cruise can be used on 400,000 miles of North American roads. Following a crash in San Francisco, the company issued a safety recall report last month for software that governs how its vehicles act when making an unprotected left turn.

The study found that Super Cruise users were likelier to report looking away from the road for extended periods than those using other manufacturers' systems. They were also the most likely to consider unsafe driving activities safe while the assistance feature was activated.

Tesla has faced plenty of criticism for calling its system Autopilot, which suggests it is more capable of self-driving than it is. ProPILOT Assist, in contrast, suggests it's more of a driver's aid.

As noted by Reuters, the NHTSA has opened 37 special investigations involving 18 deaths in crashes involving Tesla vehicles in which systems like Autopilot were suspected of having been in use.

Some of the non-driving activities users were engaged in while the systems were activated included eating, drinking, texting, using a phone, and reading books, magazines, or newspapers.

The IIHS said most Super Cruise and Autopilot owners were male, while both sexes were more or less equally represented among ProPILOT owners. Moreover, most Super Cruise owners were over 50, while Autopilot owners tended to be younger, with a quarter under 35. ProPILOT owners' ages were evenly distributed.


 
This article opens an interesting debate. It is clear that today's so-called AI pilot assistants do not make cars fully autonomous. And Tesla is consciously and intentionally misleading its customers with its Autopilot name, as this article points out. They should be held accountable for that.
A lot of people have died because of that, unfortunately. Even so, those cases helped us understand not to rely on cars' supposedly fully autonomous driving capabilities.
But will it be reliable in the future?
I think in the near future definitely not, and it may never be possible at all. Why? Because a fully autonomous car has to interact with other human participants in traffic, who are complex and unpredictable factors. These unpredictable human factors may not be something that can be coded into AI software.
So it is time for the car industry to better regulate how AI assists with driving, to bring in some mandatory guidelines, and to stop playing with words like AutoPilot and AI just to increase sales.
 
It's possible that system design and marketing are adding to these misconceptions.
I wonder why this guy would ever think that? Could it be because Musk/Tesla calls it "autopilot" in all their marketing literature for the "feature"?

The guy is an absolute genius. He may even be smarter than Musk.
:rolleyes:
 
I guess I'm old school. I'm one of those that would never use this "auto pilot" garbage.
I actually enjoy driving. As long as I have my tunes playing, I'm good.
I don't even use the cruise control.
 
Reminds me of that dumb fool who thought that putting a Winnebago on "Cruise Control" was the same as putting an A220 on autopilot.

Good Grief!
 
I guess I'm old school. I'm one of those that would never use this "auto pilot" garbage.
I actually enjoy driving. As long as I have my tunes playing, I'm good.
I don't even use the cruise control.
Personally, I'm looking forward to getting radar-assisted cruise control that works down to 0 mph on my new vehicle.
 
I think many years ago there was a lawsuit by an ******* who thought that speed-hold assistance was an autopilot, so he activated it and went into the back (it was a motorhome)... there's no need to say what happened at the first curve in the road.
The worst thing about it is that he won the case. Since then, the manufacturer has made its manuals foolproof.
 