MIT researchers have discovered a way to help self-driving cars see through fog

Polycount


Autonomous vehicle technology has advanced considerably over the past several years, but it's still far from perfect. While many self-driving cars now possess sensor systems capable of detecting pedestrians, road signs and other vehicles in even the worst weather conditions, one major roadblock has always dogged the industry: fog.

Because many vehicles use visible light-based sensor systems to detect things like street signs and react accordingly, they rely on having a relatively clear line of sight to an object in order to detect it. Since fog scatters light, sending it on unpredictable paths instead of straight ones, misty road conditions can present serious problems for autonomous vehicles that use those systems.

Fortunately for industry leaders, that could change soon. Researchers from MIT have discovered a way to allow visible light-based sensor systems to more accurately detect objects in even the densest fog. MIT's system can "resolve images of objects and gauge their depth" at a range of up to 57 centimeters, whereas the average human can only see roughly 36 centimeters.

That number may not seem terribly impressive at first but researchers simulated "far denser" fog in their tests than anything the average human driver would ever have to worry about. For the sake of comparison, ordinary fog typically allows drivers to see roughly 30 to 50 meters.

It's also worth noting that most other self-driving cars possess sensor systems that actually perform worse than standard human vision in foggy conditions, so MIT's solution could be a significant step forward for the industry.

The details regarding how MIT's system works in practice are quite technical, but at its core, the solution lies in statistics. After recognizing that the arrival times of light reflected off fog particles adhere to a single statistical pattern -- known as a gamma distribution -- regardless of fog density, the researchers were able to develop a system that accounts for realistic fog variations over time.

In other words, fog becoming denser or lighter over time will not completely throw off MIT's sensors - rather, it would seem they can adapt accordingly.
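The statistical idea described above can be sketched numerically. The snippet below is a toy illustration, not MIT's actual code: it simulates gamma-distributed fog backscatter plus a faint return from a real object, fits a gamma distribution to the raw photon arrival times, and subtracts the fitted fog background so the object's peak stands out. All the numbers (shape, scale, distances) are made up for illustration; it requires NumPy and SciPy.

```python
# Toy sketch of background subtraction via a gamma fit (not MIT's code):
# fog backscatter arrival times follow a gamma distribution, so fitting
# one to the raw time histogram and subtracting it leaves the peak
# produced by the real object.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

# Simulated photon arrival times (nanoseconds): mostly fog backscatter...
fog_times = rng.gamma(shape=2.0, scale=3.0, size=9000)
# ...plus a small, sharp return from a real object around 20 ns out.
object_times = rng.normal(loc=20.0, scale=0.3, size=1000)
times = np.concatenate([fog_times, object_times])

# Fit a gamma distribution to all arrivals; the fog component dominates,
# so the fit approximates the fog background whatever its density.
shape, loc, scale = gamma.fit(times, floc=0.0)

# Histogram the arrivals and subtract the expected fog counts per bin.
counts, edges = np.histogram(times, bins=200, range=(0.0, 40.0))
centers = 0.5 * (edges[:-1] + edges[1:])
bin_width = edges[1] - edges[0]
fog_counts = len(times) * bin_width * gamma.pdf(centers, shape, loc, scale)
residual = counts - fog_counts

# The strongest leftover peak marks the object's return time.
print(f"estimated object return near {centers[residual.argmax()]:.1f} ns")
```

Because the fit tracks whatever gamma-shaped background is present, the same subtraction works whether the simulated fog is thin or dense, which mirrors the adaptability the article describes.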


 
Are you getting centimetres and metres mixed up there.. or am I just misunderstanding the measurements?

"resolve images of objects and gauge their depth" at a range of up to 57 centimeters, whereas the average human can only see roughly 36 centimeters.

is it meant to be:

"resolve images of objects and gauge their depth" at a range of up to 57 meters, whereas the average human can only see roughly 36 meters?

That would make more sense then as you read through the article
 
What is the reason for not using some form of Radar for the job of cutting through fog to see cars and people please ? There must be a good reason for not using it.
 
Are you getting centimetres and metres mixed up there.. or am I just misunderstanding the measurements?

"resolve images of objects and gauge their depth" at a range of up to 57 centimeters, whereas the average human can only see roughly 36 centimeters.

is it meant to be:

"resolve images of objects and gauge their depth" at a range of up to 57 meters, whereas the average human can only see roughly 36 meters?

That would make more sense then as you read through the article
No, if you read through the article, the next paragraph explains everything:
"That number may not seem terribly impressive at first but researchers simulated "far denser" fog in their tests than anything the average human driver would ever have to worry about. For the sake of comparison, ordinary fog typically allows drivers to see roughly 30 to 50 meters."
 
Ok, what about sonar? Can't 'see' traffic signs, but still can navigate, and with intelligent roads, there's no need to 'see' traffic signs.
 
Ok, what about sonar? Can't 'see' traffic signs, but still can navigate, and with intelligent roads, there's no need to 'see' traffic signs.
Dude, you know I hate to be a downer, ( ;) ), but many to most of the municipalities in the US can't afford to fix a winter's potholes. So now, you're suggesting we make all the country's roads "smart", to humor these a**holes and their dreams of getting filthy rich, by taking away what should be one of a human being's blessings and pleasures, the joy of driving his own damned car.

Given that commercial grade, weatherproof electronics are fabulously expensive, look forward to some poor slob knocking down a stop sign featuring GPS transmission and telemetry, then getting a bill for at least five grand's worth of damage.
 
Ok, what about sonar? Can't 'see' traffic signs, but still can navigate, and with intelligent roads, there's no need to 'see' traffic signs.
Dude, you know I hate to be a downer, ( ;) ), but many to most of the municipalities in the US can't afford to fix a winter's potholes. So now, you're suggesting we make all the country's roads "smart", to humor these a**holes and their dreams of getting filthy rich, by taking away what should be one of a human being's blessings and pleasures, the joy of driving his own damned car.

Given that commercial grade, weatherproof electronics are fabulously expensive, look forward to some poor slob knocking down a stop sign featuring GPS transmission and telemetry, then getting a bill for at least five grand's worth of damage.
Exactly. When these Silicon Valley types decide to use their vast fortunes to redo every single road, more power to them. But that had better be a public gift, no toll roads.

And after the Uber incident, autonomous vehicles are being questioned more now than ever before. The video released showed the car had plenty of time to see the pedestrian before she was hit.

Rebuilding our infrastructure for "SMART" anything, especially when tech moves so quickly, is a stupid idea hands down.
 
Are you getting centimetres and metres mixed up there.. or am I just misunderstanding the measurements?

"resolve images of objects and gauge their depth" at a range of up to 57 centimeters, whereas the average human can only see roughly 36 centimeters.

is it meant to be:

"resolve images of objects and gauge their depth" at a range of up to 57 meters, whereas the average human can only see roughly 36 meters?

That would make more sense then as you read through the article
No, if you read through the article, the next paragraph explains everything:
"That number may not seem terribly impressive at first but researchers simulated "far denser" fog in their tests than anything the average human driver would ever have to worry about. For the sake of comparison, ordinary fog typically allows drivers to see roughly 30 to 50 meters."

There's still the issue of "centimeters" vs. "meters". The number 36 may fall within the range of 30 & 50...but I guarantee that if you lay down measuring tape & mark off where you hit 36 centimeters, 30 meters, & 50 meters, the mark for 36 centimeters will be far, far away from the 30-meter & 50-meter marks.
 
Ok I had to read it a couple of times to make it sink into the dense fog of my brain matter... The testing fog is thicker... got it
 