Fatal autonomous Uber crash could have been a result of improper sensor tuning, report...

Polycount

Staff

Uber doesn't exactly have the best record as far as controversies go, and that doesn't seem to be changing anytime soon.

The most recent example of this trend involves a fatal crash in March, when a self-driving Uber vehicle struck and killed bicyclist Elaine Herzberg.

Though the vehicle was driving itself, there was an Uber employee in the driver's seat. Interestingly, dash cam footage revealed the driver was looking down for several seconds prior to the crash.

At the time, however, Tempe's police force said the driver was not likely responsible for the incident. New information has surfaced today that could reinforce that claim.

According to a report from The Information, the blame for the unfortunate accident might lie squarely on Uber's shoulders.


The outlet claims sources familiar with the situation believe the car's autonomous systems were tuned improperly, leading the vehicle to regard Herzberg as a "false positive," or an object not worth stopping over.

Examples of a real false positive might include a plastic bag floating in front of the vehicle or a newspaper lying on the road.
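To illustrate how a tuning threshold can produce this kind of misclassification, here is a minimal, hypothetical Python sketch. The function name, labels, and confidence numbers are all invented for illustration and do not reflect Uber's actual software:

```python
# Hypothetical sketch: a single confidence threshold decides whether a
# detection is worth braking for. All names and numbers are illustrative.

def should_brake(detections, confidence_threshold):
    """Brake only if some detection's classifier confidence clears the bar."""
    return any(conf >= confidence_threshold for _, conf in detections)

# The perception stack reports (label, confidence) pairs for one frame.
frame = [("plastic_bag", 0.30), ("pedestrian", 0.55)]

# A moderate threshold brakes for the pedestrian...
assert should_brake(frame, confidence_threshold=0.50)

# ...but a threshold tuned too high (to suppress bags and newspapers)
# discards the same pedestrian as a "false positive" and never brakes.
assert not should_brake(frame, confidence_threshold=0.60)
```

The point of the sketch is that "tuning" is a trade-off: raising the threshold reduces phantom braking on harmless debris, but at the cost of sometimes ignoring real obstacles.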

This information, if true, will likely not be available to Herzberg's relatives for use against Uber in a court of law -- the company was quick to settle with the family after the incident took place.

Uber initially put their autonomous vehicle testing program on hold following the crash, but they've since restarted it. With that in mind, it's likely Uber has already fixed the sensor tuning issue if it existed at all.


 
I've said it several times already here, and you guys are probably sick of hearing it... but we have additional sensor technology that could be used to help prevent this stuff from happening. Thermal imaging would have clearly shown that it was a person and not a plastic bag or newspaper blown across the street by the wind. Laser scanners would have detected a human-sized object in the road. In the end, though, all the decisions are being handled by software built by humans, so it will never be perfect.
 
So, I was right. It was avoidable. I still don't understand how they found the driver at no fault even though they were clearly staring at their phone while their only fuqing job was to drive the car.

Maybe the city's Uber fed politicians should be at fault for not putting stricter rules on Uber's road experiments?
 
Does the vehicle not have redundant systems? While thermal imaging would be helpful, it would be no less "confused" by other thermal sources around it, like auto engines, industrial compressors used for road work, etc. Still, aside from all of this, let's not lose sight of the fact that the woman was jaywalking, wearing dark clothing, and in general did about everything possible to put herself in a very dangerous position without even bothering to look up the roadway at oncoming traffic.

I must say I agree that there will never be a "perfect" system; all systems have eventual flaws. That being said, autonomous vehicles have proven to have a superior record compared to human operators and are not tempted to answer phones, engage in texting, drive drunk or high, or a host of other habits that make roadways dangerous .... and for those of us that are getting long in the tooth, they provide a degree of "freedom" that won't easily be replaced!
 
If an "improperly tuned sensor" resulted in this, then that indicates how fragile self-driving vehicles are. Imagine, if these vehicles were common, how often a misadjusted sensor would cause problems.

I got my car fixed, and the tech (a dealer technician) forgot to readjust the sensor with the ECU. If that were a self-driving vehicle, that oversight could have killed someone.

The idea of self-driving cars is WAY too far ahead of the technology, and is being pursued recklessly.
 
So, I was right. It was avoidable. I still don't understand how they found the driver at no fault even though they were clearly staring at their phone while their only fuqing job was to drive the car.

Because jaywalkers assume liability when they jaywalk.
 
Because jaywalkers assume liability when they jaywalk.
Change the victim to a child. Same problem. Car and driver failed.

However, the victim was not a child. I am not holding Uber blameless, but I am not placing the entire blame on them either.

Playing the blame game is not going to fix anything.

From the connections I have with Uber, what I have heard is that the people who have been working on the autonomous cars were devastated when they heard this had happened, and they want to do everything they can to prevent it from happening ever again.

As others have commented already, this is something built by humans and it will not be perfect. In other words, stuff can still happen. We've got to live and learn from our mistakes and keep moving forward. Sadly, some of those lessons come at the cost of a life or lives.
 
Change the victim to a child. Same problem. Car and driver failed.

Jaywalkers assume liability. Change the jaywalker to a child and you have the exact same legal situation. Source: kids have been hit and killed jaywalking in my state... drivers don't get charged, because of the jaywalking (unless the driver does a hit-and-run). Point is, if the jaywalker doesn't create the opportunity for the accident, the accident doesn't happen, whether or not the driver was distracted or the technology failed.
 
I've said it several times already here, and you guys are probably sick of hearing it... but we have additional sensor technology that could be used to help prevent this stuff from happening. Thermal imaging would have clearly shown that it was a person and not a plastic bag or newspaper blown across the street by the wind. Laser scanners would have detected a human-sized object in the road. In the end, though, all the decisions are being handled by software built by humans, so it will never be perfect.

Lidar is already a common self-driving car sensor, as are radar and computer vision. I don't know about thermal sensors, but I do know that it is very difficult and expensive to get a good thermal image that a computer can reliably interpret. There is a reason why FLIR sensors still rely on human operators to look at their output, and don't really have any way to automate their data processing.
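The comment above lists lidar, radar, and computer vision as complementary sensors. One common way to frame that redundancy is to require agreement between independent sensors before acting. A purely illustrative Python sketch follows; the names, sensor set, and voting rule are hypothetical and do not describe any real vehicle's software:

```python
# Hypothetical sketch of sensor-fusion voting: declare an obstacle only
# when enough independent sensors agree. Names and rules are illustrative.

def fused_obstacle(reports, min_agreement=2):
    """Return True when at least `min_agreement` sensors report an object."""
    agreeing = [name for name, detected in reports.items() if detected]
    return len(agreeing) >= min_agreement

# Vision misses a dark-clothed pedestrian at night, but radar and lidar
# both return a solid object in the lane: two of three sensors agree.
assert fused_obstacle({"vision": False, "radar": True, "lidar": True})

# A single noisy return from one sensor is not, on its own, enough.
assert not fused_obstacle({"vision": False, "radar": True, "lidar": False})
```

Real fusion stacks are far more sophisticated (probabilistic, tracked over time), but the sketch shows why a sensor that one modality misses can still be caught by another.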
 
Lidar is already a common self-driving car sensor, as are radar and computer vision. I don't know about thermal sensors, but I do know that it is very difficult and expensive to get a good thermal image that a computer can reliably interpret. There is a reason why FLIR sensors still rely on human operators to look at their output, and don't really have any way to automate their data processing.
Do you know if lidar is used in Uber's automated vehicles? Thermal sensors have come down in price considerably, though one appropriate for this usage scenario may still be quite expensive due to the necessary resolution and frame rate. Regardless, if it could help prevent collisions, then perhaps they should be trying it. I do understand where you are coming from, though, with regard to interpreting the thermal data and the computer being programmed well enough to spot a true danger without a bunch of false positives. Perhaps self-driving cars just are not a good option right now. I certainly don't think they are.
 
Cadillac used to have (maybe they still do) an infrared 'viewer' at the bottom of the windshield (kind of like a heads-up display). At night, the heat signatures of people or animals on the side of the road would show, giving you much more warning of a potential hazard than just the headlights. I'm not sure if it's still around, or if it was discontinued or had problems of some kind.
 
Jaywalkers assume liability. Change the jaywalker to a child and you have the exact same legal situation. Source: kids have been hit and killed jaywalking in my state... drivers don't get charged, because of the jaywalking (unless the driver does a hit-and-run). Point is, if the jaywalker doesn't create the opportunity for the accident, the accident doesn't happen, whether or not the driver was distracted or the technology failed.
So having failing technology and bad drivers on the road is alright as long as nothing is in the street where it shouldn't be... This could have been a sofa dropped out of Jobob's pickup, and then you've got a driver who will probably claim injury and sue Uber, or a 2,000 lb cow ready to crush your windshield in on you while you read a book and drive...
 
Do you know if lidar is used in Uber's automated vehicles? Thermal sensors have come down in price considerably, though one appropriate for this usage scenario may still be quite expensive due to the necessary resolution and frame rate. Regardless, if it could help prevent collisions, then perhaps they should be trying it. I do understand where you are coming from, though, with regard to interpreting the thermal data and the computer being programmed well enough to spot a true danger without a bunch of false positives. Perhaps self-driving cars just are not a good option right now. I certainly don't think they are.
I'm not just talking about the monetary cost of the hardware, but its size and power requirements. There are also significant engineering challenges: picking out objects from the thermals rising off the road surface; thermals that will change throughout the day, with different surfaces, climates, etc.
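One textbook way to handle the drifting road-surface thermals described above is a slowly updating background model: track the scene's baseline temperature over time and flag only pixels noticeably warmer than it. A minimal, hypothetical Python sketch (function names, the delta threshold, and the temperatures are all illustrative, not from any real system):

```python
# Hypothetical sketch: background subtraction on a thermal frame.
# Frames are 2D lists of temperatures in degrees C; all values illustrative.

def warm_object_mask(frame, background, delta=4.0):
    """Flag pixels noticeably warmer than the slowly-varying background."""
    return [[t - b > delta for t, b in zip(row, brow)]
            for row, brow in zip(frame, background)]

def update_background(background, frame, alpha=0.05):
    """Exponential moving average tracks slow road-temperature drift."""
    return [[(1 - alpha) * b + alpha * t for b, t in zip(brow, row)]
            for brow, row in zip(background, frame)]

# A 1x3 strip of road at ~30 C with a ~37 C person at the middle pixel.
background = [[30.0, 30.0, 30.0]]
frame = [[30.5, 37.0, 30.2]]

# Only the person stands out against the learned background.
assert warm_object_mask(frame, background) == [[False, True, False]]

# The background then absorbs a small fraction of the new frame,
# so gradual heating of the road does not trigger false alarms.
background = update_background(background, frame)
```

The hard part, as the comment notes, is that a fixed `delta` fails when engines, compressors, or sun-heated asphalt are also well above the background; this sketch only shows the basic mechanism.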
 
Because jaywalkers assume liability when they jaywalk.
Change the victim to a child. Same problem. Car and driver failed.

However, the victim was not a child. I am not holding Uber blameless, but I am not placing the entire blame on them either.

Playing the blame game is not going to fix anything.

From the connections I have with Uber, what I have heard is that the people who have been working on the autonomous cars were devastated when they heard this had happened, and they want to do everything they can to prevent it from happening ever again.

As others have commented already, this is something built by humans and it will not be perfect. In other words, stuff can still happen. We've got to live and learn from our mistakes and keep moving forward. Sadly, some of those lessons come at the cost of a life or lives.

Actually, it is their fault. When they decided to forgo thermal imaging and lidar, they basically said: "We're gonna make this car just as sensory-blind as an average human."

Except that a human driver has actual intelligence, while the car doesn't. So if they want to beat a human driver, they need to give the car a lot more sensory input than an average human gets. But they decided to go cheap.

I bet they were devastated by running over a woman, but what did they expect? It's them being such cheapos that caused the accident. Either they have to increase the intelligence of the car (to make higher-quality decisions with less info) or they have to provide more info. With enough info, even a stupid brain can draw good conclusions (which explains why governments want to collect all the info they can about us).

So, it's the old game. You save money, you lose quality. Then you cry: "I didn't want this to happen." What did you expect? Someone should be fired there. A manager, for a start. Not only fired, but charged.
 
So having failing technology and bad drivers on the road is alright as long as nothing is in the street where it shouldn't be... This could have been a sofa dropped out of Jobob's pickup, and then you've got a driver who will probably claim injury and sue Uber, or a 2,000 lb cow ready to crush your windshield in on you while you read a book and drive...

If you're going to strawman, at least do it with a little bit of knowledge of traffic laws.

If you improperly secure a load, it breaks free, and someone smashes into it moments later because they were distracted with their phone/radio/etc...

The person who improperly secured the load assumes liability.

Just like if a jaywalker strolls into the middle of the street at night and gets taken out by a car.

The only time this changes is in the event that the driver of the impacting vehicle creates the situation that leads to the accident. For instance, hitting a pedestrian at a crosswalk when the ped has the right of way, or failing to slow down in a school zone and hitting someone because of distracted driving.

Further responses in this thread will be ignored because you aren't willing to discuss this topic honestly.

See:
If the jaywalker doesn't create the opportunity for the accident, the accident doesn't happen, whether or not the driver was distracted or the technology failed.
So having failing technology and bad drivers on the road is alright as long as nothing is in the street where it shouldn't be...
 
Actually, it is their fault. When they decided to forgo thermal imaging and lidar, they basically said: "We're gonna make this car just as sensory-blind as an average human."

Except that a human driver has actual intelligence, while the car doesn't. So if they want to beat a human driver, they need to give the car a lot more sensory input than an average human gets. But they decided to go cheap.

I bet they were devastated by running over a woman, but what did they expect? It's them being such cheapos that caused the accident. Either they have to increase the intelligence of the car (to make higher-quality decisions with less info) or they have to provide more info. With enough info, even a stupid brain can draw good conclusions (which explains why governments want to collect all the info they can about us).

So, it's the old game. You save money, you lose quality. Then you cry: "I didn't want this to happen." What did you expect? Someone should be fired there. A manager, for a start. Not only fired, but charged.

First, these cars do sometimes have lidar - but it is far from an end-all, be-all solution, and it is almost exclusively used on models that will be operating on closed courses (or at least courses without other autonomous cars using lidar). Readings of lidar bounce-backs can be confused by different surface materials, by the weather, by unusual structural geometries, and by other lidar sensors operating in proximity. It is a significant engineering challenge to add lidar, and even more so to add thermal sensors. Not adding them was an engineering decision, because every hour spent trying to get these unreliable and imperfect technologies working is an hour that could be spent improving something reliable and better-developed like radar or machine vision.

It would have been more irresponsible for the engineers to have added these immature technologies and expected them to perform like more mature ones.
 
First, these cars do sometimes have lidar - but it is far from an end-all, be-all solution, and it is almost exclusively used on models that will be operating on closed courses (or at least courses without other autonomous cars using lidar). Readings of lidar bounce-backs can be confused by different surface materials, by the weather, by unusual structural geometries, and by other lidar sensors operating in proximity. It is a significant engineering challenge to add lidar, and even more so to add thermal sensors. Not adding them was an engineering decision, because every hour spent trying to get these unreliable and imperfect technologies working is an hour that could be spent improving something reliable and better-developed like radar or machine vision.

It would have been more irresponsible for the engineers to have added these immature technologies and expected them to perform like more mature ones.
I completely disagree... but that's ok. We're not here to be friends, but just you wait... I'm calling it now. Accidents like this will continue to happen over time, and either the tech will be completely banned or abandoned, or they will start adding and using additional sensor technology to help prevent these accidents. I'm betting on the latter. It will be fun to see who is right. I'll see you again in 10 years to gloat. Actually, never mind, I'll forget; in fact I already have.
 
I completely disagree... but that's ok. We're not here to be friends, but just you wait... I'm calling it now. Accidents like this will continue to happen over time, and either the tech will be completely banned or abandoned, or they will start adding and using additional sensor technology to help prevent these accidents. I'm betting on the latter. It will be fun to see who is right. I'll see you again in 10 years to gloat. Actually, never mind, I'll forget; in fact I already have.
... so your argument is "technology will improve over 10 years"? What a concept.

You don't add new sensors until two things happen:
1) you've finished adding the previous sensors, including getting them working to their full potential
2) the new sensors are technologically ready

We're still on #1 with machine vision and radar, and at less than #1 for lidar and thermal. My job is literally in the field of remote sensing, automation, and feedback & controls. If you have a solution for reliably managing Doppler effects in a laser-point grid in an RTOS environment, for separating signals from competing lidar sensors, or for automatically separating information from 'noise' in a thermal image, then please, I encourage you to develop these into solutions for the autonomous car industry and solve these problems - you'll get rich doing so.
 
If you're going to strawman, at least do it with a little bit of knowledge of traffic laws.

If you improperly secure a load, it breaks free, and someone smashes into it moments later because they were distracted with their phone/radio/etc...

The person who improperly secured the load assumes liability.

Just like if a jaywalker strolls into the middle of the street at night and gets taken out by a car.

The only time this changes is in the event that the driver of the impacting vehicle creates the situation that leads to the accident. For instance, hitting a pedestrian at a crosswalk when the ped has the right of way, or failing to slow down in a school zone and hitting someone because of distracted driving.

Further responses in this thread will be ignored because you aren't willing to discuss this topic honestly.

See:
It doesn't matter a single fuq who tied the load down if the load-tier isn't there because they drove off... If they drive off and are never seen again, you still have the same problem... Are you going to ticket the cow for jaywalking, or get mad at the guy who drove off, or blame Uber because you weren't paying attention? Grow some sense, man, holy schnikes.

You're playing the legal blame game. That doesn't matter if your car sees a 2,000 lb cow as an object not worth stopping for. Or should we ticket any elk trying to jaywalk in front of these self-driving cars? Elk's fault; it should have adhered to traffic laws.
lmao at the "I get the last word" statement - ignore the responses like your self-driving car would ignore that 2,000 lb cow, since it can't ticket it for jaywalking.
 
... so your argument is "technology will improve over 10 years"? What a concept.

You don't add new sensors until two things happen:
1) you've finished adding the previous sensors, including getting them working to their full potential
2) the new sensors are technologically ready

We're still on #1 with machine vision and radar, and at less than #1 for lidar and thermal. My job is literally in the field of remote sensing, automation, and feedback & controls. If you have a solution for reliably managing Doppler effects in a laser-point grid in an RTOS environment, for separating signals from competing lidar sensors, or for automatically separating information from 'noise' in a thermal image, then please, I encourage you to develop these into solutions for the autonomous car industry and solve these problems - you'll get rich doing so.
No. My argument, or rather opinion, is that the current "machine vision" tech - which is just software analyzing images from optical cameras to identify objects and decide how to respond to them - is insufficient for safe and reliable vehicle automation, and that over the coming years it will be decided that more sensor data is required. I'm not going to argue the challenges of making these additional sensor technologies work reliably, as I have already admitted that is a problem... I'm just saying we're going to have to do more than what we are doing now if we want these self-driving cars to be safe.
 
No. My argument, or rather opinion, is that the current "machine vision" tech - which is just software analyzing images from optical cameras to identify objects and decide how to respond to them - is insufficient for safe and reliable vehicle automation, and that over the coming years it will be decided that more sensor data is required. I'm not going to argue the challenges of making these additional sensor technologies work reliably, as I have already admitted that is a problem... I'm just saying we're going to have to do more than what we are doing now if we want these self-driving cars to be safe.
It sounds like we have just been saying the same thing, but in two different ways then.

I am no fan of autonomous cars being tested on public roads, for these very reasons. Our legislative process is not fluent enough in these technologies to safely regulate them, imo, at least not right now while they are being developed.
 
It sounds like we have just been saying the same thing, but in two different ways then.

I am no fan of autonomous cars being tested on public roads, for these very reasons. Our legislative process is not fluent enough in these technologies to safely regulate them, imo, at least not right now while they are being developed.
I don't think so... you are discussing the tech and the challenges associated with it. I'm simply saying we need more sensors or self driving cars will fail to happen full scale and accidents will continue.
 
You're playing the legal blame game.
Curious, have you ever heard of the fine for "obstructing traffic flow"? That fine exists to help prevent accidents, just as much as defensive driving does. The idea is to keep the roadway clear so that defensive driving is not necessary.
 
Curious, have you ever heard of the fine for "obstructing traffic flow"? That fine exists to help prevent accidents, just as much as defensive driving does. The idea is to keep the roadway clear so that defensive driving is not necessary.
You go ahead and try to ticket a few elk for that... Who are you going to blame when the self-driving cars smash some elk into their windshields - the traffic obstruction, or yourself for assuming defensive driving is beneath you?
 