California DMV suspends GM Cruise's permit to operate its driverless taxis

midian182

What just happened? Following several accidents involving the cars, including a recent incident in which a pedestrian was dragged and trapped under the wheels of a self-driving vehicle, California has suspended Cruise's fleet of driverless taxis from the state's roads. The agency has also accused Cruise of misrepresenting the safety of its vehicles.

The California DMV said General Motors-owned Cruise vehicles were "not safe for the public's operation" and that it was revoking its permits to test and deploy vehicles on public streets.

"When there is an unreasonable risk to public safety, the DMV can immediately suspend or revoke permits," the regulator said, adding that the company can still operate its vehicles as long as a safety driver is present.

The DMV also said that Cruise had "misrepresented any information related to safety of the autonomous technology of its vehicles."

Earlier this month, a Cruise vehicle was involved in an incident that saw a San Francisco pedestrian hit by two cars. According to the company, the woman was struck by a human-driven car, hurling her in front of a driverless taxi that ran her over and stopped with its rear tire still on her leg.

The DMV said it is still reviewing the latest incident. The agency revealed that after stopping, the Cruise vehicle attempted a pullover maneuver while the pedestrian was still under its wheels, dragging her another 20 feet. The DMV said Cruise did not disclose the pullover maneuver in its meeting with the California Highway Patrol the day after the accident, providing only the onboard camera footage that showed the initial collision.

Cruise was first granted a permit to operate paid robotaxi rides in San Francisco in June last year. In August, the California Public Utilities Commission gave its approval for autonomous vehicle companies to expand robotaxis' hours of operation to 24/7 while charging for rides.

In June, San Francisco officials said there had been 90 incidents since January involving robotaxis from Cruise and Alphabet's Waymo, which retains its permit to test driverless vehicles. The incidents included cars hitting a bus, causing massive traffic jams, and running over a dog. They led to protests against autonomous vehicles in July in which activists placed traffic cones on the cars' hoods to disable them.

Cruise agreed to cut its fleet of San Francisco robotaxis in half in August at the request of the DMV after one of its vehicles collided with a fire truck. Elsewhere, the National Highway Traffic Safety Administration opened a safety investigation into the company last week.


 
Lol; file this under human stupidity. Another great example of fake "AI" that has no self-awareness or adaptively creative decision-making or situational awareness ability.
 
This whole driverless cars thing is showing itself yet again to be dumb. Expecting it to drive around so many factors it cannot measure and adapt to (even simple things like grip) is impossible, so it will always be inherently flawed and single-minded, hence the accidents where the system went "um" and its human tender didn't realise until it was too late. Considering these are meant to drive completely independently eventually (or so they say), it's currently not proving that way at all, and if someone has to be at the wheel and aware, then it's pointless bar an element of laziness. Radar cruise control, yes, but this is just dumb and unnecessary (but of course business will follow it as another revenue stream no matter what).
 
So, they're going to ban human drivers too on the same basis of unreasonable risk to public safety, right? Right?
 
The really wild part about this situation is that the accident that led to it involved a human driver hitting a pedestrian (on a crosswalk, I believe, which is currently being treated as a hit and run), which propelled the pedestrian into the path of the Cruise autonomous vehicle. The robotaxi panic-stopped, but it was too close to the pedestrian to complete the maneuver and came to rest on top of her. The robotaxi then attempted a pull-to-curb maneuver with the pedestrian still under the car, dragging her around 20 ft in the process. There is currently contention within the review board that Cruise may have cropped the vehicle's onboard video so it would not show the events after the vehicle stopped, thereby trying to cover up the robotaxi's post-accident pull-to-curb maneuver. This situation is getting messy extremely quickly.

As far as I'm aware the pedestrian is alive and being treated.
 
Like with many new things, it will take time to work through all the variables and create a dependable device. After all, how many years did cars drive around without traffic signs, lights, etc., and accidents back then were rampant? The stats clearly indicate that these driverless vehicles are safer and more dependable than human-operated vehicles, and when you throw in the number of distracted drivers, there is a very wide spread in safety in favor of the driverless machines. Sadly, too many people equate these vehicles with a loss of their freedom; it's just not the same thing.
 
So, they're going to ban human drivers too on the same basis of unreasonable risk to public safety, right? Right?
Terrible human drivers do get their licenses revoked too. That hit-and-run driver, once caught, probably would not only lose their license but also face other penalties. Cruise is unlikely to suffer beyond losing its license temporarily.

There are terrible human drivers who shouldn't have a license, and a lot of passable ones where the benefit of giving them a license outweighs the risk. Each human is different, so different decisions can be made regarding their licenses. To me, the really interesting part is this: given that the autonomous fleet is running the same piece of software, should we consider it a single driver? If so, should a single offense that would have revoked a human's license revoke the license for the entire fleet? Or should the threshold be different, and if so, how do we set it? That's a new legal/moral question we didn't have for human drivers.
 
There are terrible human drivers who shouldn't have a license, and a lot of passable ones where the benefit of giving them a license outweighs the risk. Each human is different, so different decisions can be made regarding their licenses. To me, the really interesting part is this: given that the autonomous fleet is running the same piece of software, should we consider it a single driver? If so, should a single offense that would have revoked a human's license revoke the license for the entire fleet? Or should the threshold be different, and if so, how do we set it? That's a new legal/moral question we didn't have for human drivers.
Eh, I think it would be pretty excessive to treat an entire fleet like a single driver. The only thing that matters is the accident/injury/death rate per mile traveled, and, not having looked it up, I'm willing to bet that rate is already lower for self-driving cars than for humans, despite self-driving cars still having a lot of room to improve.

Really, I'd personally rather there not be cars at all, but I'm aware that I'm 100 years too late for that ideal.
 