Through the looking glass: MIT's work is both fascinating and frightening, though in all likelihood this sort of technology is already in advanced stages of development behind closed doors.
Researchers with MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have been working for the better part of a decade on refining technology that effectively grants X-ray vision. With the help of artificial intelligence, they’re one step closer to achieving their goal.
Their latest project, RF-Pose, utilizes a neural network to analyze radio signals that bounce off people’s bodies. We’ve previously seen MIT demonstrate the ability to use Wi-Fi to see through walls in this manner, but with the new AI smarts, the output is far more precise.
Earlier iterations simply showed a rough, heat-map-like outline of what vaguely resembled a person. Now, the system generates a dynamic stick figure that directly mimics a person’s movements. Future iterations aim to use 3D skeletons to capture even smaller micro-movements, we’re told.
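MIT hasn't published implementation details in this report, but keypoint-based pose estimators commonly work like this: the network emits a per-joint confidence map over the scene, and the peak of each map becomes one joint of the stick figure. Here's a minimal sketch of that decoding step; all names and data are illustrative, not taken from RF-Pose itself.

```python
# Illustrative sketch (not MIT's code): decode a stick figure's joints
# from per-joint confidence maps by taking each map's highest-scoring cell.

def decode_keypoints(confidence_maps):
    """Return the (row, col) of the highest-confidence cell in each map."""
    keypoints = {}
    for joint, grid in confidence_maps.items():
        # Scan every cell and keep the coordinates with the largest score.
        best = max(
            ((r, c) for r in range(len(grid)) for c in range(len(grid[0]))),
            key=lambda rc: grid[rc[0]][rc[1]],
        )
        keypoints[joint] = best
    return keypoints

# Toy confidence maps for two joints on a 3x3 grid (hypothetical values).
maps = {
    "head":  [[0.1, 0.9, 0.1],
              [0.0, 0.2, 0.0],
              [0.0, 0.0, 0.0]],
    "wrist": [[0.0, 0.0, 0.0],
              [0.0, 0.1, 0.0],
              [0.3, 0.0, 0.8]],
}

print(decode_keypoints(maps))  # head at (0, 1), wrist at (2, 2)
```

Connecting the decoded joints with lines is what produces the stick-figure visualization described above; the hard part, which the neural network handles, is producing sensible confidence maps from noisy radio reflections in the first place.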
The technology also works in dimly lit environments and can process multiple individuals in a single scene.
MIT says the new technology could be used to monitor diseases like multiple sclerosis, muscular dystrophy and Parkinson’s, providing a better understanding of disease progression without a patient having to wear a sensor or remember to charge a device.
RF-Pose could also be used in search-and-rescue missions to help locate survivors or even to create a new class of video games that allow people to freely move around their homes.