Using four mics and a speaker, drones can echolocate like bats

Cal Jeffrey

In brief: Researchers have been studying a way to reconstruct areas surrounding a drone using sound waves. This process, known as echolocation, is used by bats and other animals to orient themselves in their environment and detect objects.

Mathematicians from Purdue and the Technical University of Munich (TUM) have been working on a signal processing method that uses four microphones and a speaker attached to a drone to “sense” the area around the craft. The speaker emits a sound, and microphones attached to the four corners of the drone pick up the echoes. Using algebra and geometry, it is possible to calculate the distance to an object from the time delay of its echo.
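The time-of-flight arithmetic behind this is simple; here is a minimal sketch (the constant, names, and numbers are illustrative, not from the paper):

```python
# Rough sketch of the time-of-flight idea: sound travels to the wall
# and back, so the one-way distance is half the round-trip path.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def wall_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting surface, assuming the speaker and
    microphone are co-located."""
    return SPEED_OF_SOUND * echo_delay_s / 2

# A wall 5 m away returns an echo after about 29 ms:
delay = 2 * 5.0 / SPEED_OF_SOUND
print(round(wall_distance(delay), 3))  # -> 5.0
```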

The trick is figuring out which walls or objects correspond to each calculated distance, a process known as echosorting. It is a vital part of the signal processing; without it, the algorithms could produce “ghost walls.” For echosorting to work, the microphones must be set up in a non-planar fashion.
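One way to picture echosorting (a sketch of the general mirror-source idea, not the authors' published algorithm — the mic positions, wall location, and function names below are made up for illustration): each wall echo behaves as if it came from a mirror image of the speaker reflected in that wall. A correct assignment of echo distances trilaterates to one consistent mirror point, while a wrong assignment leaves a large residual — a “ghost wall” that can be rejected. Note that the 3×3 solve below only works because the four mics are non-planar.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system via Cramer's rule."""
    def det(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    xs = []
    for k in range(3):
        Ak = [row[:] for row in A]
        for r in range(3):
            Ak[r][k] = b[r]
        xs.append(det(Ak) / D)
    return xs

def mirror_source_residual(mics, dists):
    """Trilaterate the mirror-image source from mic positions and a
    candidate assignment of echo distances, then return the worst
    mismatch.  Near zero: consistent with one real wall.  Large: a
    'ghost' combination that should be discarded."""
    m1, d1 = mics[0], dists[0]
    A, b = [], []
    for mi, di in zip(mics[1:], dists[1:]):
        # Subtracting |p - m1|^2 = d1^2 from |p - mi|^2 = di^2
        # linearizes the problem in the unknown point p.
        A.append([2 * (m1[j] - mi[j]) for j in range(3)])
        b.append(di**2 - d1**2
                 - sum(c * c for c in mi) + sum(c * c for c in m1))
    p = solve3(A, b)
    return max(abs(math.dist(p, mi) - di) for mi, di in zip(mics, dists))

# Four non-planar mic offsets from the drone's center (metres):
mics = [(0.2, 0.0, 0.0), (-0.2, 0.0, 0.0), (0.0, 0.2, 0.0), (0.0, 0.0, 0.2)]
wall_image = (3.0, 1.0, 0.5)  # mirror image of the speaker in some wall
dists = [math.dist(wall_image, m) for m in mics]

print(mirror_source_residual(mics, dists))        # consistent: ~0
print(mirror_source_residual(mics, dists[::-1]))  # scrambled: ghost
```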

Purdue University’s Associate Professor of Mathematics and Electrical and Computer Engineering Mireille Boutin says that the signal processing algorithms have applications for people, underwater vehicles, and cars.

So far, Boutin and TUM Professor of Algorithmic Algebra Gregor Kemper have successfully reconstructed the wall configurations in different rooms and published their research in the SIAM Journal on Applied Algebra and Geometry. They will continue studying different scenarios, such as “when the movement of the drone is restricted, or when the drone listens to the echoes of consecutive sounds as it is moving.”

While they have not talked about the commercialization of their algorithms, they see practical applications that could be beneficial and profitable. For instance, a compact device carried in the pocket of a blind person could warn him or her of walls or other obstacles in the way.


 
They will continue studying different scenarios, such as “when the drone listens to the echoes of consecutive sounds as it is moving.”
----------------------------------------------------------
People can do that too!

Some people (like some blind people and myself) can also see in the dark or over hills when listening to the echoes of consecutive sounds as they are moving
"Penetrating sounds like these work best"

Even the deaf can feel the penetrating echoes of this sled screaming across the lake at 3A.M.
I can wake your deaf grandmother from 3 miles away with this thing and she won't know why she feels that way, but she will surely wake up with a smile on her face

Some blind people can see (without sight) the outlines of objects (cars/walls etc) by making a clicking sound with their mouth

Echolocation is not the only trick....
I can hear people on the other side of a large room, talking to the person right next to them with a live band playing at very loud levels

If they are sitting directly in front of the speakers (in line with me), their voice is carried across the room as an Amplitude Modulated signal just like AM Radio

You do not need stupid high frequency Radio or Radar to perform these feats
These are simple physics
Low Frequency Audio Physics
 
To me, it seems rather obvious that such a setup would work. If I am not mistaken, bat ears are asymmetrically located on a bat's head, and this fact gives bats the ability to determine depth and relative position to objects based on echolocation.

... LIDAR would be considerably larger gear and use more power.
Not necessarily.

Not to mention faster waves, for which it would be harder to calculate time differences. So detecting distances would be quicker, but direction would be harder.
I agree that with light, the elapsed time would be difficult to measure in a small space; however, comparison of phase between the emitted and the received light gives substantially better results. Phase comparison is already used in various light-based devices. If it is not employed already, I bet it is also possible to use red/blue shift to determine whether the object is approaching or receding relative to the current location of the scanner.
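The phase-comparison idea can be sketched like this (the modulation frequency and names are my own illustration): an amplitude-modulated beam comes back with its modulation phase delayed in proportion to the round trip, which gives distance — but only within one half-wavelength of ambiguity, which is why real phase rangefinders combine several modulation frequencies.

```python
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s

def phase_to_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """One-way distance from the measured phase shift of an
    amplitude-modulated beam, valid only within the first unambiguous
    interval (half a modulation wavelength, because of the round trip)."""
    wavelength = C / mod_freq_hz
    return (phase_rad / (2 * math.pi)) * wavelength / 2

# A 10 MHz modulation gives a ~30 m wavelength, so distances up to
# about 15 m are unambiguous.  A target 6 m away shifts the phase by:
f = 10e6
phase = 2 * math.pi * (2 * 6.0) / (C / f)
print(round(phase_to_distance(phase, f), 3))  # -> 6.0
```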
 
I agree that with light, the elapsed time would be difficult to measure in a small space; however, comparison of phase between the emitted and the received light gives substantially better results. Phase comparison is already used in various light-based devices. If it is not employed already, I bet it is also possible to use red/blue shift to determine whether the object is approaching or receding relative to the current location of the scanner.
First off, I agree. I was only suggesting the need for faster equipment to process the faster waveforms. Can you imagine, though, giving our automobiles hearing as well as all those different sight processes? We rely on our hearing way more than we would like to admit. When we hear a sound, it draws our attention so that we sight in on what made the noise. We can often tell if an object is moving simply by the sound it produces; it even helps us gauge its rate of speed. Something as simple as sound waves would take up minimal processing power compared to lidar and radar.

To be honest I think they all need to be used together for best results.
 
First off, I agree. I was only suggesting the need for faster equipment to process the faster waveforms. Can you imagine, though, giving our automobiles hearing as well as all those different sight processes? We rely on our hearing way more than we would like to admit. When we hear a sound, it draws our attention so that we sight in on what made the noise. We can often tell if an object is moving simply by the sound it produces; it even helps us gauge its rate of speed. Something as simple as sound waves would take up minimal processing power compared to lidar and radar.

To be honest I think they all need to be used together for best results.
As I see it, it is hard to predict the processing requirements. It might appear that sound is easier to process, however, sound is also easy to produce. To me, it seems that the biggest issue would be the ratio of signal to noise, and acoustic noise from some other source might look like signal rather than noise and give a false result.

I agree that a combination of the three may produce better results.
 
Yeah OK, listen (pun intended), bat echolocation is at ultrasonic frequencies, well above human hearing capability. In fact, some species can hear as high as 200 kHz. Our hearing tops out as low as 15 kHz, especially after the males of our species hit their early 20s.
Dolphins use echolocation as well, and keep in mind the speed of sound through water is roughly 1,500 m/s, more than four times its speed in air.

So, using the term "speaker" is possibly a misnomer; you would likely need an "ultrasonic transducer" to pull this off. One principle is the fact that the higher the audio frequency, the more of a straight line in which it projects.

And really, the only additional technique that would be required is to use digital pulse code modulation on the radiated signal, in order for the device to distinguish its signal from other drones around it.
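The pulse-coding idea can be sketched with a simple cross-correlation: a drone that knows its own pseudo-random chip sequence can pick its echo out of noise (and out of other drones' pings) by looking for the correlation peak. Everything here — the chip count, noise level, and echo position — is made up for illustration:

```python
import random

def correlate(signal, code, offset):
    """Dot product of the known code against the signal at one offset."""
    return sum(c * signal[offset + i] for i, c in enumerate(code))

def find_own_echo(signal, code):
    """Return the offset where the known code correlates best."""
    return max(range(len(signal) - len(code) + 1),
               key=lambda k: correlate(signal, code, k))

random.seed(1)
# A 31-chip pseudo-random +1/-1 sequence identifies "our" pulse.
code = [random.choice([-1, 1]) for _ in range(31)]

# Simulated received signal: noise, plus our echo arriving at sample 40.
signal = [random.gauss(0, 0.2) for _ in range(120)]
for i, c in enumerate(code):
    signal[40 + i] += c

print(find_own_echo(signal, code))  # -> 40
```

A second drone using a different chip sequence would correlate weakly against this code, which is the point of coding the pulses in the first place.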

Using the term "speaker", IMO, is a massive misnomer in the article, because the term "loudspeaker" technically only attaches to devices designed to output sound within the human hearing range.

I mean really, you don't want your home theater to reproduce sounds that draw every dog in the neighborhood to bark outside your door. So the sweet spot to employ echolocation is between 100 kHz and 200 kHz, which is where bats and dolphins have been doing it for millions of years. Where do you think we got the idea for "SONAR"?




True SONAR does work in the range of human hearing, to slightly above. Different frequencies are used for different purposes.

Here's about four feet on the topic at Wiki, if anybody is that interested or has the time to waste:

 
Not to mention faster waves, for which it would be harder to calculate time differences. So detecting distances would be quicker, but direction would be harder.
I'm almost absolutely certain you're right about this. Light travels ever so much faster than electricity. When you put any resistance in its path, electricity will slow ever so slightly. In other words, I don't think an electronic device can detect and decode a "light echo", since at close range, it has to be slower than the light reflection it's trying to measure.

Here's the speed of light: in SI units, the speed of light in a vacuum is 299,792,458 meters per second, or roughly 983,571,056 feet per second.

The moon is about 1.3 light-seconds away (give or take), a round trip of roughly 2.5 seconds. At those distances, you start to be able to make accurate measurements using light.
 
I agree that with light, the elapsed time would be difficult to measure in a small space; however, comparison of phase between the emitted and the received light gives substantially better results. Phase comparison is already used in various light-based devices. If it is not employed already, I bet it is also possible to use red/blue shift to determine whether the object is approaching or receding relative to the current location of the scanner.
Guys, this is a tech forum, can't we call the phenomenon by its given name, "Doppler effect"?

The whole blue-to-red shift is most notoriously used to determine the rate at which galaxies millions of light years away are receding.

I believe the Doppler effect is both workable and applicable at any frequency; otherwise, dolphins and bats would have flashlights in their heads, if light phase shift were the most practical approach.
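For what it's worth, the classical two-way Doppler formula for sound backs this up; a sketch with illustrative numbers (the target hears a shifted tone and re-radiates it, shifting it a second time):

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def doppler_shift(f_emit: float, v_target: float) -> float:
    """Echo frequency off a target moving at v_target m/s (positive =
    approaching), for a stationary source and receiver, using the
    classical two-way Doppler formula f' = f * (c + v) / (c - v)."""
    c = SPEED_OF_SOUND
    return f_emit * (c + v_target) / (c - v_target)

# A 100 kHz ping off a target approaching at 10 m/s comes back higher:
print(round(doppler_shift(100_000, 10.0)))  # -> 106006
```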
 