Google gets permission to operate Soli gesture detection radar at higher powers

Greg S

The big picture: Bringing wireless products to market requires filing a lot of regulatory paperwork. Google has been working on Soli, a small radar module for gesture detection, and has now been granted FCC approval to operate the device with far fewer restrictions.

Beginning in 2015 as part of Google's Advanced Technology and Projects group, Soli has grown into a 3D hand-sensing system that uses radar to quickly detect intricate hand gestures. In a recent FCC ruling, Google was granted permission to operate Soli at higher power levels and aboard aircraft.

Pressing your thumb and index finger together can simulate a virtual button push. Rubbing your fingers together can replicate turning a dial. Sliding your thumb along your index finger acts as a virtual slide control.

During the course of development, Soli started off as a box full of off-the-shelf components. Google engineers went through numerous iterations before condensing the system down to an 8mm by 10mm chip complete with integrated antennas operating in the 60GHz ISM band.

Now that regulatory restrictions on output power have been lifted, Google has a bit more freedom to get some real world testing done. Applications include wearables, smartphones, automotive use, and IoT devices.

For tracking small motions at relatively close range, one might expect that high radar resolution would be required. However, Google has actually implemented a fairly coarse spatial resolution. Instead of directly tracking the hand's position, movements are detected by looking at changes in the received signal, which can be correlated with recognized actions. A group of students even used Soli to determine the composition of objects.
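Soli's actual signal pipeline isn't public, but the general idea described above can be sketched: rather than resolving the hand spatially, each received frame is compared with the previous one, and the pattern of change over time is what gets correlated with gestures. The sketch below is purely illustrative (the function names, noise level, and threshold logic are assumptions, not Google's API); it shows why a sub-millimeter finger movement is easy to detect even with coarse spatial resolution, because at 60GHz (roughly a 5mm wavelength) a tiny motion produces a large phase shift in the echo.

```python
import numpy as np

rng = np.random.default_rng(0)

def received_frame(hand_offset_mm, n_samples=64, wavelength_mm=5.0):
    """Simulate one frame of a 60 GHz return (wavelength ~5 mm).
    A small hand movement shifts the phase of the two-way echo."""
    phase = 4 * np.pi * hand_offset_mm / wavelength_mm  # round-trip path
    noise = 0.05 * (rng.standard_normal(n_samples)
                    + 1j * rng.standard_normal(n_samples))
    return np.exp(1j * phase) * np.ones(n_samples) + noise

def frame_change(prev, curr):
    """Coarse motion feature: mean magnitude of frame-to-frame change."""
    return np.abs(curr - prev).mean()

# Two frames of a still hand differ only by noise; a 0.5 mm movement
# (far below the radar's spatial resolution) produces a large change.
still_a = received_frame(0.0)
still_b = received_frame(0.0)
moved = received_frame(0.5)

print(frame_change(still_a, still_b))  # small (noise floor)
print(frame_change(still_b, moved))    # large (motion detected)
```

In a real system, a sequence of such change features over time would be fed to a classifier trained on the button-press, dial-turn, and slider gestures the article describes.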

Assuming that a device has enough processing power, the Soli SDK allows for effective frame rates between 100 and 10,000 FPS. This truly real-time tracking has the potential to create more natural interactions across a variety of Google's intended applications.
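To put those frame rates in perspective, the arithmetic below works out the per-frame processing budget they imply (this is simple math on the quoted numbers, not figures from Google):

```python
# Per-frame time budget implied by the quoted Soli SDK frame rates.
for fps in (100, 10_000):
    budget_us = 1_000_000 / fps  # microseconds available per frame
    print(f"{fps:>6} FPS -> {budget_us:,.0f} us per frame")
# At the top end, the whole detect-and-classify pipeline must fit
# in roughly 100 microseconds per frame.
```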


 
Am I the only one believing that this can also be achieved visually and thus without any radiation? This would be more expensive, though. Those little radar chips can be bought for cents in China (movement detectors, etc.).
 
Am I the only one believing that this can also be achieved visually and thus without any radiation? This would be more expensive, though. Those little radar chips can be bought for cents in China (movement detectors, etc.).
The processing required for 3D vision to get accurate enough spatial data, versus how relatively simple radar is, would be why they don't. And then the equipment is quite a bit more expensive (as you mentioned).

The cheapest solution is by having something "simple" bounce off of the hand/object back to the sensor. And cheap means that adoption of this tech is far more likely.

Ps. Technically visual light is a form of radiation. It's just different wavelengths :p
 
60GHz is well away from the resonant frequency of water, so no risk there, and the wavelength is far too large to begin interacting with cells (never mind DNA), so no risk there either.

I do wonder how they are handling the issue of clutter though?
 
60GHz is well away from the resonant frequency of water, so no risk there, and the wavelength is far too large to begin interacting with cells (never mind DNA), so no risk there either.

I do wonder how they are handling the issue of clutter though?
The problem is the interference. You don't want that with "input devices". You want them to be reliable.
 
The problem is the interference. You don't want that with "input devices". You want them to be reliable.

That is literally what "clutter" is... interference from background objects and the medium the RF is passing through.
 