Tempe police chief says Uber not likely at fault in fatal self-driving accident with pedestrian

Shawn Knight

Staff member

Police responded to a traffic accident in Tempe, Arizona, on Sunday night where it was discovered that an Uber SUV operating in self-driving mode had collided with a pedestrian. The individual, identified as 49-year-old Elaine Herzberg, was taken to a local hospital where she later died of her injuries.

According to a new report from the San Francisco Chronicle, Herzberg, who was pushing a bicycle encumbered with plastic shopping bags, abruptly walked from a center median into a lane of traffic where she was struck by the SUV. Authorities believe she may have been homeless.

Tempe Police Chief Sylvia Moir relayed to the publication that the driver said it “was like a flash, the person walked out in front of them.” According to the chief, the driver’s first indication of the collision was the sound of the impact.

According to the preliminary investigation, the self-driving Volvo was traveling at 38 mph in a 35 mph zone and made no attempt to brake.

The Uber was equipped with at least two video cameras – one facing outward towards the street and another focused on the backup driver inside. Moir said that based on the video captured, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway.”

The collision reportedly happened within 100 yards or so of a marked crosswalk. The police chief noted that it’s dangerous to cross roadways in the evening hours, especially when well-illuminated, managed crosswalks are available.

Investigators from the National Highway Traffic Safety Administration and the National Transportation Safety Board continue to look into the incident. For now, Moir said she doesn’t believe Uber would likely be at fault in the accident based on preliminary data but didn’t rule out the potential for charges to be filed against the Uber backup driver.

Police have not yet released videos from the Uber vehicle.


 
It takes about 168 feet to stop a car going 40 mph. 88 of those feet come from the driver's reaction: seeing the person in front of them and slamming on the brake. A computer is always going to have a faster reaction time, while the distance the car travels once the brakes are applied is essentially the same for any given speed.
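For the curious, those figures can be sanity-checked with basic kinematics. The 1.5-second reaction time and 0.7 g of braking deceleration below are assumptions (typical textbook values), not numbers from the article:

```python
# Rough stopping-distance check for the figures quoted above.
# Assumptions: 1.5 s driver reaction time, ~0.7 g deceleration on dry pavement.

MPH_TO_FPS = 5280 / 3600          # 1 mph = ~1.4667 ft/s
G_FPS2 = 32.174                   # gravity, ft/s^2

def stopping_distance_ft(speed_mph, reaction_s=1.5, decel_g=0.7):
    """Return (reaction distance, braking distance, total) in feet."""
    v = speed_mph * MPH_TO_FPS
    reaction = v * reaction_s                   # distance covered before braking starts
    braking = v ** 2 / (2 * decel_g * G_FPS2)   # v^2 / (2a)
    return reaction, braking, reaction + braking

r, b, total = stopping_distance_ft(40)
print(f"reaction: {r:.0f} ft, braking: {b:.0f} ft, total: {total:.0f} ft")
# reaction: 88 ft, braking: 76 ft, total: 164 ft
```

The 88-foot reaction distance matches the comment exactly; the total lands near 168 feet (the small gap just reflects a slightly different assumed deceleration). A computer only shortens the first term.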
 
It takes about 168 feet to stop a car going 40 mph. 88 of those feet come from the driver's reaction: seeing the person in front of them and slamming on the brake. A computer is always going to have a faster reaction time, while the distance the car travels once the brakes are applied is essentially the same for any given speed.
Your math is off...
 
It takes about 168 feet to stop a car going 40 mph. 88 of those feet come from the driver's reaction: seeing the person in front of them and slamming on the brake. A computer is always going to have a faster reaction time, while the distance the car travels once the brakes are applied is essentially the same for any given speed.
You're math is off...

*Your spelling is off... (couldn't resist)
 
There is talk in the article about video and being able to see the person. There are technologies out there, such as collision-avoidance radar, that do not need to visually see people in order to stop the car. IMO, the tech should not be excused so easily. Perhaps radar would not have been able to stop the car either, but at least it would have made the effort. If autonomous vehicles are ever going to make it in the world, we cannot excuse inferior tech, because doing so will end up making the problem worse. They have to account for as many unknowns as possible; otherwise, autonomous vehicles will be ambulance-chasing lawyer heaven, and there is the potential for more lives to be lost.
 
Is that vehicle not equipped with a laser scanner? If set up correctly, that would see a person crossing the road without an issue. Not only did the onboard computer not detect the person crossing the street, but it didn't notice the hit either. What sort of weak-sauce sensor setup is Uber running here? Probably trying to do it on the cheap...
 
If you're trying to understand blame for an incident, and you want to get as close as possible to the truth, then you have to go as deeply as possible into its underlying causality. In this case: why was that person homeless, when it's totally unnecessary for anyone to be in that position? The roots are the negligence of a society whose values are horribly misguided.

Yes, this is a rant, and it's slightly off topic. But the article mentioned that information, and the fact that the police report apparently saw fit to include it would imply that the woman was regarded differently than most other people would be, and that it was a salient fact in the investigation.


Now, to the matter of the self-driving car. I would imagine that even at this early date the number of accidents per mile would be lower with self-driving vehicles than with human-driven.

But autonomous vehicles are never going to work in town. Specifically, the vision of streets where every vehicle is autonomous and steering wheel-less and guided by inter-vehicle 5G communication is just a fantasy. Autonomous vehicles will be great on highways and even country roads, but in town there are just too many destination decisions, too many executive decisions, if you will, that require specifically human insight. Even in my small town, you have to use your savvy of the town, all your spatial skills, and all the human ways that you relate to terrain, just to find an appropriate parking space.

Where a steering wheel-less car will work is only in situations where there is a dedicated destination on both ends. For instance, a shuttle to pick up a passenger at an airport and take them to a specific hotel. On each end, there's a definite and reserved destination.

A city full of self-driving cars is a mechanized city where car travelers are no longer free to alter course as events transpire. It's a technological utopia/dystopia sort of vision that won't really save lives or time.
 
Is that vehicle not equipped with a laser scanner? If setup correctly that would see a person crossing the road without an issue. Not only did the onboard computer not detect the person crossing the street, but it didn't notice the hit either. What sort of weak sauce sensor setups is Uber running here? Probably trying to do it on the cheap...

Trying to do it on the cheap? The SUV they used is roughly a $40k SUV. Don't you think if they were trying to do it cheap, they might be running something like a Honda Accord or something way less expensive? Too many people on here are jumping to conclusions. From how I'm reading it, it's pretty unanimous that the vehicle isn't believed to be at fault based on the circumstances. Also, the whole point of having a driver is to watch out for things like this. You can blame the car all you want, but the driver is there to ensure the vehicle is operating safely. Who knows what he or she was doing at the time of the accident. He could have been taking selfies on Snapchat for all anyone knows at this point. Wait for the full report to come out and then blame who or whatever you want.
 
The car didn't even attempt to brake - which means that it did not detect the person! As the article stated, this is most likely because the victim crossed the street abruptly in an illegal location... Sometimes we don't have to look very hard for a cause.... This one looks like the answer is the victim's stupidity.... Still sad - even dumb people don't deserve to die - but stop blaming the tech!

People die every day on the roads - until this incident, all in human-driven cars.... I'd be shocked if driverless cars don't prove to be WAY safer than driven ones...
 
It's not just about how long it takes to stop. There are more moves than just stopping. You can swerve and change direction and miss them instead of waiting for your car to slow all the way down.
 
It's not just about how long it takes to stop. There are more moves than just stopping. You can swerve and change direction and miss them instead of waiting for your car to slow all the way down.
But this way you might risk the driver's life.

That's the main ethical problem with autonomous cars. In some situations the algorithm will have to choose... the victim.
Some manufacturers have already said they will always protect the driver, and I'm not really comfortable with that.
The weakest should be the most privileged on the road.
 
What are you expecting? For the car to abruptly swerve into a tree? Or change lanes? Or head into oncoming traffic?... Yeah, let's risk a couple more lives in the process of trying to save a suicidal person...

Come on... stop it already it got boring way too fast.
 
What are you expecting? For the car to abruptly swerve into a tree? Or change lanes? Or head into oncoming traffic?... Yeah, let's risk a couple more lives in the process of trying to save a suicidal person...

Come on... stop it already it got boring way too fast.

Yeah, why even bother braking...
 
Some pedestrians apparently want to die. I have had a pedestrian sprint between traffic and hit my car harder than I hit them, damaging my car in the process.

We even have cases of pedestrians fighting for their lives after getting nailed by a vehicle and still getting jaywalking tickets. Sometimes it's not the driver's fault; that's true more often than people realize.
 
Pedestrians who jaywalk should always be at fault. If you want to risk getting hit to save the 30 seconds it takes to find a crosswalk or a set of lights on a busy road, then it's your fault. For whatever reason, losers will walk out and stop traffic 50 feet from an intersection with a set of lights.
 
And now we encounter one of the biggest problems with programming a vehicle to drive itself: accounting for the randomness and potential illogical nature of human behavior.

This may sound callous, but sometimes I think that self-driving cars that can be dangerous to pedestrians could be a good thing. My city is full of pedestrians that are usually one of the following:

A) Actual crazy/stupid darting out into traffic randomly. Why? Reasons
B) Self-absorbed tools with the "I'm more important, the cars will stop" attitude.
C) Inattentive peds, usually with nose pressed to their phones not even looking up when crossing.
D) The most rare of all: actual sensible walkers who look both ways and don't step in front of moving cars.

I'm not sure when most of the general population lost their innate common sense and had their self-preservation instincts dulled. It has certainly been a progressive degenerative disorder. Perhaps if their lives are at risk, walkers will actually start paying attention and obeying basic traffic rules.

Side note: This would also serve to cull many of the bicyclists that I encounter that demand all cars around them obey the rules of the road, but seldom obey those same rules themselves. Just a thought.
 
Trying to do it on the cheap? The SUV they used is roughly a $40k SUV. Don't you think if they were trying to do it cheap, they might be running something like a Honda Accord or something way less expensive? Too many people on here are jumping to conclusions. From how I'm reading it, it's pretty unanimous that the vehicle isn't believed to be at fault based on the circumstances. Also, the whole point of having a driver is to watch out for things like this. You can blame the car all you want, but the driver is there to ensure the vehicle is operating safely. Who knows what he or she was doing at the time of the accident. He could have been taking selfies on Snapchat for all anyone knows at this point. Wait for the full report to come out and then blame who or whatever you want.
I'm not blaming the car I'm blaming the autonomous driving computer tech Uber is using. We have the tech to detect objects, people, and animals crossing the road without the need for visible light. There is no reason not to be using this tech, other than it being cost prohibitive. I think though if you are trying to make a 4,000 lb vehicle autonomous, risking the lives of everyone around it, you should be using the absolute best tech you can.

Real time 360 degree horizon adjustable laser scanning, thermal imaging, etc... whatever is needed to ensure the vehicle isn't going to hit a living being or something that can damage the vehicle and cause a crash.

And frankly, the only reason they have a backup driver is to cover their own arse. I'm sure if they could get away with it at some technical comfort level, they wouldn't even have a backup driver. And from the situation report so far, the driver may not have been able to see this jaywalker in the first place.

Again, we have the tech... why not use it to save lives? And I'll admit I don't actually know what Uber is using, perhaps they are using some of the tech I described or something equally useful but perhaps it failed to notice the pedestrian due to a software related design decision.
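The redundancy argument above can be sketched as a simple voting rule: brake if any one modality is confident on its own, so a camera blinded by darkness can't silence the lidar or radar. The sensor names and thresholds here are purely illustrative; nothing about Uber's actual stack is known from the article:

```python
# Hypothetical sketch of the redundancy argument: treat each sensor
# modality independently and brake if ANY of them is confident enough.
# Thresholds and sensor names are illustrative, not Uber's real setup.

THRESHOLDS = {"camera": 0.8, "lidar": 0.5, "radar": 0.5}

def should_brake(detections):
    """detections: dict of modality -> obstacle confidence in [0, 1].

    A camera blinded by darkness simply reports low confidence;
    lidar and radar, which do not need visible light, can still
    trigger the brake on their own.
    """
    return any(conf >= THRESHOLDS[sensor]
               for sensor, conf in detections.items()
               if sensor in THRESHOLDS)

# Night scene: camera sees almost nothing, lidar picks up the pedestrian.
print(should_brake({"camera": 0.1, "lidar": 0.9, "radar": 0.3}))  # True
print(should_brake({"camera": 0.1, "lidar": 0.2, "radar": 0.3}))  # False
```

The design choice is the OR, not the thresholds: an any-sensor rule trades false positives (phantom braking) for fewer missed detections, which is the direction this commenter is arguing for.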
 
What if...

What if the car's computer had decided to swerve and instead struck a vehicle to the side sending it spinning out of control and killing 3 people?

What if the car's computer decided to brake hard, causing the truck behind to rear-end the car, causing severe whiplash to the front occupants and killing the 2 children in the back seat? And the woman was still hit but "only" was paralyzed from the waist down.

What if the car's computer detected the pedestrian's movement onto the road and calculated that it was impossible to stop or slow the car in the distance - let's assume the pedestrian was detected stepping out about 5 feet from the car or some 0.5 seconds away - and it decided that a safety threshold had been breached, that no safe options were available to avoid the collision and to just let it happen?

What are the logic steps that the software could have followed?
Because from the article it sounds like it happened so suddenly that nothing was going to save that woman from an impact.

All we can get from this is the opportunity for law enforcement and engineers to analyse all of the sensor data which this car uniquely had and to develop better technology for protecting human life, whether it's outside or inside the vehicle.
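As a rough illustration of that "no safe option" scenario, here's a back-of-envelope check of how detection distance determines speed at impact. The 0.1-second system latency and 0.8 g braking figures are assumptions for the sketch, not anything from the investigation:

```python
# Back-of-envelope feasibility check: given detection distance and speed,
# can the car stop (or meaningfully slow) before reaching the pedestrian?
# Latency and deceleration values are assumptions, not investigation data.

MPH_TO_FPS = 5280 / 3600
G_FPS2 = 32.174

def impact_speed_mph(speed_mph, detect_ft, latency_s=0.1, decel_g=0.8):
    """Speed (mph) at the pedestrian's position; 0 if the car stops first."""
    v = speed_mph * MPH_TO_FPS
    d = detect_ft - v * latency_s        # distance left after system latency
    if d <= 0:
        return speed_mph                 # no braking possible at all
    a = decel_g * G_FPS2
    v2 = v ** 2 - 2 * a * d              # v_f^2 = v_0^2 - 2ad
    return 0.0 if v2 <= 0 else (v2 ** 0.5) / MPH_TO_FPS

print(f"{impact_speed_mph(38, 5):.1f} mph")    # detected 5 ft out: full speed
print(f"{impact_speed_mph(38, 150):.1f} mph")  # detected 150 ft out: full stop
```

At 38 mph the car covers about 56 feet per second, so a 5-foot detection distance is gone before any system can respond; the whole question is how far out the sensors first saw her.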
 
What if...

What if the car's computer had decided to swerve and instead struck a vehicle to the side sending it spinning out of control and killing 3 people?

What if the car's computer decided to brake hard, causing the truck behind to rear-end the car, causing severe whiplash to the front occupants and killing the 2 children in the back seat? And the woman was still hit but "only" was paralyzed from the waist down.

What if the car's computer detected the pedestrian's movement onto the road and calculated that it was impossible to stop or slow the car in the distance - let's assume the pedestrian was detected stepping out about 5 feet from the car or some 0.5 seconds away - and it decided that a safety threshold had been breached, that no safe options were available to avoid the collision and to just let it happen?

What are the logic steps that the software could have followed?
Because from the article it sounds like it happened so suddenly that nothing was going to save that woman from an impact.

All we can get from this is the opportunity for law enforcement and engineers to analyse all of the sensor data which this car uniquely had and to develop better technology for protecting human life, whether it's outside or inside the vehicle.
Well then we would have Skynet... and the car's computer would eventually decide we are all a disease and it would proceed to kill us all.

On a more serious note, one of the things I learned in race car driving school was that you should really avoid swerving to avoid a bump/hit... this actually applies quite well to defensive driving on public roads. Because, as with the what-ifs you mentioned, a swerve can certainly cause more problems than it prevents. Just go look at some dash cam videos on YouTube and see what happens when people swerve at highway speeds to avoid something as benign as a car slowly drifting into their lane. They lose control and hit several other cars in the process, creating a huge mess... while often the person who "caused" the "accident" (the lane drifter) just keeps driving on as if nothing happened. The best thing you can do in most cases is to brake hard. Sure, sometimes that is worse, but it is much less likely to have a negative outcome than swerving.

Now, that's for human control; a computer-controlled vehicle, I would hope, would be completely aware of its surroundings. So the computer would know whether it is safe to swerve or not, and if it did swerve, the computer could prevent losing control. We already have electronic stability control to help prevent losing control, so the tech exists even in non-autonomous cars. Anyway, I've gone off on a tangent.
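A quick back-of-envelope comparison supports the brake-don't-swerve point. Assuming roughly 0.7 g of available grip (an assumption; real tires can't deliver full grip laterally and longitudinally at the same time), half a second of swerving moves the car under three feet sideways, while half a second of hard braking sheds nearly 8 mph:

```python
# Rough comparison of what 0.5 s buys you: lateral offset from a swerve
# vs. speed shed by hard braking, both at an assumed ~0.7 g of grip.

MPH_TO_FPS = 5280 / 3600
G_FPS2 = 32.174

def swerve_offset_ft(t_s, lat_g=0.7):
    """Lateral displacement under constant lateral acceleration: d = a*t^2/2."""
    return 0.5 * lat_g * G_FPS2 * t_s ** 2

def speed_shed_mph(t_s, decel_g=0.7):
    """Speed reduction under constant braking: dv = a*t."""
    return decel_g * G_FPS2 * t_s / MPH_TO_FPS

print(f"swerve: {swerve_offset_ft(0.5):.1f} ft sideways")  # ~2.8 ft
print(f"brake:  {speed_shed_mph(0.5):.1f} mph slower")     # ~7.7 mph
```

Under three feet of lateral offset usually isn't enough to clear a pedestrian, but since impact energy scales with the square of speed, even a modest speed reduction meaningfully softens the hit.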

Back to Skynet. We're all doomed.
 