Waymo under NHTSA investigation following multiple crashes, reported traffic violations

midian182

What just happened? In another blow to robotaxis and the self-driving industry as a whole, Waymo is being investigated by the government highway safety agency following numerous reports of the company's vehicles being involved in crashes or potentially violating traffic laws.

The NHTSA's Office of Defects Investigation (ODI) said it has reports of 17 crashes and five possible traffic violations involving Waymo vehicles equipped with its 5th generation automated driving system (ADS). The agency says the ADS was either engaged in each incident or disengaged in the moments just before they occurred.

The reports include collisions with stationary and semi-stationary objects such as gates, chains, and parked vehicles, and instances in which the ADS appeared to disobey traffic safety control devices such as traffic lights. Some of the collisions occurred shortly after the ADS exhibited unexpected behavior near these devices.

While Waymo reported some of the accidents to the NHTSA, as required when autonomous vehicles are involved in collisions, other incidents surfaced through publicly available reports, including Waymo vehicles driving in opposing lanes near oncoming traffic or entering construction zones.

As reported by The Verge, videos of Waymo vehicles driving erratically have been posted online over the last few weeks. One showed a car driving on the wrong side of the road to avoid a pack of electric unicycle riders.

[Reddit video: "Just people having a fun time," posted by u/blazelord69 in r/sanfrancisco]

Another video, filmed in Tempe, Arizona, shows a Waymo taking a left turn and driving into oncoming traffic, seemingly to avoid the jam of cars waiting at the lights.

[TikTok video from @kilowattsapp: "Hello officer, sry I got a little confused"]

Waymo recently analyzed 7.13 million fully driverless miles in Phoenix, Los Angeles, and San Francisco, comparing the results to human drivers. It says its self-driving vehicles were 6.7 times less likely than human drivers to be involved in a crash resulting in an injury (an 85 percent reduction over the human benchmark) and 2.3 times less likely to be in a police-reported crash (a 57 percent reduction).

News of the probe is just the latest bit of bad publicity for self-driving and semi-autonomous tech. Waymo rival Cruise paused its operations last year after a woman was struck by a human-driven car and hurled in front of a driverless Cruise taxi, which ran her over and stopped with its rear tire still on her leg. The vehicle then attempted a pullover maneuver while the pedestrian was still under the wheels, dragging her another 20 feet.

Last month, the NHTSA opened a safety investigation into Ford's BlueCruise "Level 2" driver assistance system following two fatal accidents allegedly involving the feature. There was also an NHTSA report linking Tesla's Autopilot systems to nearly 1,000 crashes from the last few years, over two dozen of them fatal.

 
TBF I've seen human drivers do this and so much worse on a remarkably regular basis.

Now all these companies need is one of those human-like robots behind the wheel to roll down the window and shout obscenities at everyone else who is driving "wrong".
 
Personally, I don't think most people will accept a driverless system that falls short of perfection by more than a few thousandths or ten-thousandths of a percent. The instance in the article where the vehicle turned left into oncoming traffic is particularly egregious, IMO, and Waymo was lucky that, apparently, no one was injured.

And it does not surprise me in the least that there have been some 1,000 crashes involving Tesla's "Autopilot" POS, whether they were the driver's responsibility or not.

I have not tested this on my 2024 Prius Prime (and I don't intend to test it deliberately), but if the vehicle detects my lack of attention for any significant amount of time, it is supposed to come to a full stop in-lane. Toyota implemented this to cover instances where the driver may be incapacitated for some reason. Maybe Teslas have this, or something similar, now, but why they have not had it from the get-go is beyond me. To me, the fact that Teslas have lacked something like this is all the evidence I need that Musk is not the genius his fans make him out to be. To me, he's only interested in gaining all the money he possibly can, no matter the cost to Tesla customers.

As I see it, "self-driving" systems have significant room for improvement before they are totally ready for "prime time."
 
TBF I've seen human drivers do this and so much worse on a remarkably regular basis.

Now all these companies need is one of those human-like robots behind the wheel to roll down the window and shout obscenities at everyone else who is driving "wrong".
Oh yes, humans do this. But let's hope that our driverless cars copy the behavior of safe human drivers and not the crazy ones lol.

I have not tested this on my 2024 Prius Prime (and I don't intend to test it deliberately), but if the vehicle detects my lack of attention for any significant amount of time, it is supposed to come to a full stop in-lane. Toyota implemented this to cover instances where the driver may be incapacitated for some reason. Maybe Teslas have this, or something similar, now, but why they have not had it from the get-go is beyond me. To me, the fact that Teslas have lacked something like this is all the evidence I need that Musk is not the genius his fans make him out to be. To me, he's only interested in gaining all the money he possibly can, no matter the cost to Tesla customers.

As I see it, "self-driving" systems have significant room for improvement before they are totally ready for "prime time."
Teslas have had this from the beginning, and furthermore they will prevent you from using Autopilot if the car detects you're not paying attention. Where you got the information that Teslas do not do this is beyond me... Here are two videos from 7 years ago:

Nowadays it's much more aggressive. Within seconds it starts nagging the driver to pay attention; it locks the driver out of Autopilot in 20 seconds and comes to a complete stop in 50 seconds. If you keep not paying attention on later drives, it disables the feature for multiple days:
 