Uber won't reapply for permit to test autonomous cars in California

midian182

Staff member

Following the fatal crash on March 18, Uber took all of its self-driving cars off the roads in the four cities where it operates them. Now, it has been revealed that the company won't be renewing its California autonomous vehicle testing permit, which expires on March 31.

In a letter from the state’s Department of Motor Vehicles to Uber, DMV deputy director and chief counsel Brian Soublet wrote: “Uber has indicated that it will not renew its current permit to test autonomous vehicles in California.”

Uber’s cars will no longer be able to operate on public roads in the state once the current permit expires next week. The company said it does not know when it will reapply for permission to test the vehicles.

“We proactively suspended our self-driving operations, including in California, immediately following the Tempe incident,” an Uber spokesperson said, in a statement. “Given this, we decided not to reapply for California DMV permit with the understanding that our self-driving vehicles would not operate on public roads in the immediate future.”

If Uber does apply for a new permit, it will need to “address any follow-up analysis or investigations from the recent crash in Arizona and may also require a meeting with the department.”

Tempe, Arizona, police released a video of the accident last week, in which 49-year-old Elaine Herzberg was struck by an autonomous Uber Volvo as she pushed her bicycle across the road. She later died in hospital, becoming the first pedestrian to be killed by a self-driving car. The clip, which includes interior and exterior views, appears to have raised more questions than it answered.

The accident has elicited responses from other self-driving companies. Intel—the owner of Mobileye—said its software would have detected and classified Herzberg one second before impact.

Both Toyota and Nvidia have also decided to temporarily halt their autonomous driving tests on public roads, while Arizona has blocked Uber from testing the vehicles on roads within the state.

“The accident was tragic. It’s a reminder of how difficult SDC technology is and that it needs to be approached with extreme caution and the best safety technologies,” a Nvidia spokesperson said. “This tragedy is exactly why we’ve committed ourselves to perfecting this life-saving technology. Ultimately AVs will be far safer than human drivers, so this important work needs to continue. We are temporarily suspending the testing of our self-driving cars on public roads to learn from the Uber incident. Our global fleet of manually driven data collection vehicles continue to operate.”


 
I like how the Uber spokesperson claimed that their reaction to the death in Arizona was to say "We proactively suspended our self-driving operations..." They can't even lie very well. A reaction to an event is not being proactive. What a slimy company.
 
If someone walks in front of a train, do you blame the train? Most of what I gather from people who have seen the video is, "I would not have been able to stop in time if someone walked in front of my car like that."

This is a major setback for self-driving cars, and considering how many bad drivers I see on a daily basis, that is highly regrettable.
 
Even Intel, which is a rival company, said their "advanced" sensors would only have detected the woman 1 second in advance, which would NOT have been enough time to stop given that the car was going 35 miles an hour (see the rough numbers below)...

Again, if you illegally cross a busy street at night with dark clothing, you are taking your life in your hands.
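As a rough sanity check on that claim, here is a back-of-the-envelope braking calculation. This is only a sketch: the 35 mph figure comes from the comment above, and the 0.8 g deceleration is an assumed value for hard braking on dry pavement, not anything reported about the actual crash.

# Sketch: can a car doing 35 mph come to a full stop with only ~1 second of warning?
# Assumption (not from the article): hard braking at roughly 0.8 g on dry pavement.
v0 = 35 * 0.44704                    # 35 mph converted to m/s (~15.6 m/s)
a = 0.8 * 9.81                       # assumed deceleration in m/s^2 (~7.8 m/s^2)
stopping_time = v0 / a               # ~2.0 s needed to reach a complete stop
stopping_distance = v0**2 / (2 * a)  # ~15.6 m of braking distance once the brakes bite
warning_distance = v0 * 1.0          # ~15.6 m covered during the 1 s of warning
print(stopping_time, stopping_distance, warning_distance)

Under those assumptions a full stop needs about two seconds of braking, so with only one second of warning the car cannot stop completely; at best, braking would shed part of the speed.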
 
Self Driving cars were supposed to keep this from happening. They don't work
They might... in the future... but already, they're as safe or safer than a car with a driver... This is the FIRST person to die from a driverless car... whereas how many people have died in the same amount of time from cars WITH drivers? I'll give you a hint, it's a LOT more than 1!
 
From an article in this morning's WSJ, the sensors were working as designed, so the issue appears to be in the software. They need to keep hammering away; sooner or later this will be the wave of the future, especially for the elderly, the handicapped, and the alcohol/drug impaired. It's not for everyone, but it could certainly find a very useful place in society..... hey, can you see an autonomous rickshaw in the not too distant future!?!
 
Self Driving cars were supposed to keep this from happening. They don't work

That is a very simple-minded and ignorant statement right there. If you basically throw yourself in front of anything moving at a high rate of speed, nothing can stop that vehicle fast enough, detection or not. That is not a software/sensor issue, it's a stupid human issue. The only question that needs answering is why the lidar system did not detect the person any sooner. Were they behind a car, a tree, or something? If not, then you are partially correct: why did it not work in this instance when it has in others?
 
When I saw the dash video, I firmly believed that no normal person would have been able to stop in time and avoid hitting the lady. There was an article that pointed out there was only around 1.5 seconds between when you first see her foot and when the car hits her. I will still say it was the fault of the lady casually walking across the middle of the dark road. We should treat autonomous cars as normal cars; walking in front of a moving experimental car should not be an excuse.

It all changed when they released the video of the car driver. You gotta see the video. Man, why did they hire a driver like her? Her eyes were not on the road, even though this was still a test drive. Clearly she was distracted (female driver). She did not do her job properly. Easy money, huh? Now, if only the driver of the car had acted as a responsible and defensive driver, the focus could have been on the actions of the lady with the bike before the accident. In this accident, I will say it's the car driver's fault.

Edit: Links.
http://canada.autonews.com/article/...ar-video-shows-operator-was-distracted-before

EXTRA:
https://mashable.com/2018/03/26/uber-driver-staircase-blames-gps/
 
As mentioned earlier, Intel has fed the video released by Uber into the collision avoidance system installed in the Volvo XC90, a system that was DISABLED by Uber.
They say that the system detected the person 1 second before the collision, surely not enough for a complete stop, but a reduction in speed can be the difference between a fatality and a slight bruising (rough math below)!
Then there is the fact that we don't know what Uber did to this video before releasing it.
I found it quite strange that a modern sensor would have such a dark view of a street lit by streetlights.
Well, let's just say we have probably not seen all there is to see here yet...
https://www.youtube.com/watch?v=QCCmqosHT-o
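To put a rough number on that "reduction in speed" point, here is a quick sketch. The assumptions are mine, not from the article: an initial speed of 35 mph (the figure cited earlier in the thread), constant 0.8 g braking, and no actuation delay, so this is a best case, not a reconstruction of the actual crash.

# Sketch: how much speed does one second of hard braking shed from 35 mph?
MPH_TO_MS = 0.44704
v0 = 35 * MPH_TO_MS                  # initial speed, ~15.6 m/s
a = 0.8 * 9.81                       # assumed deceleration, ~7.8 m/s^2
t = 1.0                              # seconds of braking before impact
v_impact = max(v0 - a * t, 0.0)      # ~7.8 m/s, roughly 17 mph at impact
energy_ratio = (v_impact / v0) ** 2  # kinetic energy at impact vs. no braking (~0.25)
print(v_impact / MPH_TO_MS, energy_ratio)

Under those assumptions, one second of hard braking roughly halves the impact speed and cuts the impact energy to about a quarter, which is the sense in which even a late detection could have mattered.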
 
REVOLUTIONARY NEW TECHNIQUE COULD SAVE LIVES:
BREAKING NEWS! Pedestrian lives are spared when pedestrians LOOK BOTH WAYS BEFORE CROSSING A ROAD! STOP THE PRESSES!

My 7-year-old knows this rule, so why is this a hard concept for others to understand? If you care at all about your own safety or the safety of others, you'll be more inclined to make it to see old age. How is the autonomous car to blame for ***** syndrome?
 
@Kenrick, it was a male 'driver'/operator.
This pedestrian could have been identified by already available technology, yet this vehicle, self-driving at a fair rate of speed and loaded with multiple technologies to identify everything around it and avoid collisions, did not react in any way to a pedestrian. An extra second could have made all the difference. If the operator had actually been looking, that might have helped as well.
 
Self Driving cars were supposed to keep this from happening. They don't work
They might... in the future... but already, they're as safe or safer than a car with a driver... This is the FIRST person to die from a driverless car... whereas how many people have died in the same amount of time from cars WITH drivers? I'll give you a hint, it's a LOT more than 1!

I don't see how you can think these are safe when these things have only driven a fraction of a fraction of a fraction of the miles the world drives daily. Given that they don't drive in any adverse conditions, and their "safety" drivers have to intervene repeatedly, all these things are more dangerous. The fact that they are limited to such slow speeds also contributes to why there aren't as many deaths.

https://arstechnica.com/cars/2018/0...-self-driving-car-program-years-behind-waymo/

There was another article a week or so ago that gave the statistics on this. According to the mileage driven by all these self-driving cars and all the deaths reported, these cars are 95-98% MORE likely to kill someone than a human driver. Sorry, I can't find the article any more. How many people have you known in your entire life who killed someone with their car?

Raise your hand if you would like to be the next victim to test these by trying to walk across the road in front of them...
 
@Kenrick, it was a male 'driver'/operator.
This pedestrian could have been identified by already available technology, yet this vehicle, self-driving at a fair rate of speed and loaded with multiple technologies to identify everything around it and avoid collisions, did not react in any way to a pedestrian. An extra second could have made all the difference. If the operator had actually been looking, that might have helped as well.

lol. It's a female operator. Look at my link. May look like a guy to you, but that's a SHE. Her name is floating around, and even a police background. haha.
 
@Kenrick, it was a male 'driver'/operator.
This pedestrian could have been identified by already available technology, yet this vehicle, self-driving at a fair rate of speed and loaded with multiple technologies to identify everything around it and avoid collisions, did not react in any way to a pedestrian. An extra second could have made all the difference. If the operator had actually been looking, that might have helped as well.

http://www.dailymail.co.uk/news/art...-pedestrian-killed-self-driving-Uber-car.html
 
Imagine if this happened after the first human operated automobile fatality.
What is there to imagine? The fact that people were more mindful of their surroundings? Or the fact that they had a false sense of safer travel by other means? Yeah, people were being trampled by horses, just as they are run over by cars now. Yet for some reason it is frowned upon when the vehicle is self-driven.

There is an easy solution: don't walk (someone didn't like it when I used the word jump and took it literally *I'm still rolling my eyes*) in front of a moving object. Just because the vehicle has brakes doesn't mean it will stop. Anyone arrogant enough to think the vehicle will stop for them outside of a pedestrian walkway deserves to be run over.
 
Self Driving cars were supposed to keep this from happening. They don't work

That is a very simple-minded and ignorant statement right there. If you basically throw yourself in front of anything moving at a high rate of speed, nothing can stop that vehicle fast enough, detection or not. That is not a software/sensor issue, it's a stupid human issue. The only question that needs answering is why the lidar system did not detect the person any sooner. Were they behind a car, a tree, or something? If not, then you are partially correct: why did it not work in this instance when it has in others?
No, this is a software/sensor issue. We have the technology to detect objects on the roadway without the use of visible light. Clearly, Uber is not using this tech. Some, including yourself, may say it is not necessary. I, on the other hand, beg to differ.

Now, the lady killed was jaywalking and clearly had time to see the vehicle coming, but it is clear she didn't look before or while she was crossing the road. This was her fault.

But that doesn't mean that we shouldn't be using all the tech we can to make these autonomous vehicles as safe as possible. That, is what I'm getting at here.
 
No, this is a software/sensor issue. We have the technology to detect objects on the roadway without the use of visible light. Clearly, Uber is not using this tech. Some, including yourself, may say it is not necessary. I, on the other hand, beg to differ.

Now, the lady killed was jaywalking and clearly had time to see the vehicle coming, but it is clear she didn't look before or while she was crossing the road. This was her fault.

But that doesn't mean that we shouldn't be using all the tech we can to make these autonomous vehicles as safe as possible. That, is what I'm getting at here.

There are ALWAYS objects on the side of the road... like people on the sidewalk... the problem is, how does one anticipate that one of these objects is going to fling itself in front of you? Well, if it's at an intersection or crosswalk, you should be mindful.... but if it's dark, and it's NOT an intersection or crosswalk, then alas, the object might very well become a squished object....

I'm not saying it isn't tragic - any time there is a fatality, it is tragic! But sometimes, people being unintelligent is the problem!

Trains are a perfect example of this.... they only travel in one direction - straight along tracks - yet people still manage to die in front of them every year... some are intentional (suicide), but some are people being dumb...
 
Leave it to Sintel to try to profit from this. :mad: "Our system would have detected this." I smell something really disgusting.

As I see it, this should not stop development of self-driving cars. To me, a possible solution is situational awareness, which will not at all be an easy task even with significant processing and/or sensor power. In other words, monitoring not only what is in front of the vehicle but, at least in part, what is to the sides of the vehicle, too. That said, there are plenty of things that could block such monitoring and prevent detection of anything in a situation like this.
 
What is there to imagine? The fact that people were more mindful of their surroundings? Or the fact that they had a false sense of safer travel by other means? Yeah, people were being trampled by horses, just as they are run over by cars now. Yet for some reason it is frowned upon when the vehicle is self-driven.

There is an easy solution: don't walk (someone didn't like it when I used the word jump and took it literally *I'm still rolling my eyes*) in front of a moving object. Just because the vehicle has brakes doesn't mean it will stop. Anyone arrogant enough to think the vehicle will stop for them outside of a pedestrian walkway deserves to be run over.

What I see a lot of people fail to comprehend is that she would have been killed even in a crosswalk. Are you saying they code these things to only stop for people who are legally crossing the street? SMH. What if it was a child that was still learning to look both ways? I guess we should just shrug our shoulders, like so many are doing at this woman who was killed by a cold machine? This whole thing is just a single scenario among the billions of daily obstacles across the world. You can't program for everything. And then there are always hardware/software failures.
 
[QUOTE="
What I see a lot of people fail to comprehend is that she would have been killed even in a crosswalk. .
Um, what gives you that idea? Had she been at the crosswalk, pressed the button, waited for cars to stop and then crossed (which is what EVERYONE is supposed to do!!!), she'd have been perfectly fine!
 
Under that scenario, I would then ask where the child's guardian is. No one should be in the road unless they know how to look both ways, not even in a residential location.

I take it you do not have children, nor have you ever seen children playing? Like when a ball gets kicked out across the street and they run after it? Or even at a store? Or anywhere? They start running at a moment's notice. Unless you want to pass a law that says all kids need to be chained to a parent at all times, they will continue to do so. And if you really think that would work, chains wear out and break, and kids will find a way to wiggle out of them. What is your plan to prevent kids from running? I guess in this case, it was fine for a car to just run over a person because they were jaywalking?? It also would have left the scene of the murder... it did not even stop after running her over until the "safety" driver stopped it manually.
 