Robot uses Kinect to see the world, obey your commands

November 19, 2010, 9:32 AM
Philipp Robbel, a PhD student at MIT's Personal Robotics Group, has combined the Kinect motion controller with an iRobot Create platform to build a battery-powered robot that can see its environment and obey your gestured commands, according to Singularity Hub. He used simultaneous localization and mapping code from OpenSLAM.org, some visualization packages from the Mobile Robot Programming Toolkit, and his own interaction, human detection, and gesture code.
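Robbel hasn't published his pipeline, but the raw input to a setup like this is simply the Kinect's depth stream. Here's a minimal sketch, assuming the open-source OpenKinect/libfreenect Python bindings (the community drivers mentioned below), of grabbing a single depth frame of the kind a SLAM package would consume:

```python
# Minimal sketch (not Robbel's code): read one depth frame from the Kinect
# via the open-source libfreenect Python bindings.
import freenect
import numpy as np

def grab_depth_frame():
    """Fetch one 640x480 depth frame (raw 11-bit values) from the Kinect."""
    depth, _timestamp = freenect.sync_get_depth()
    return np.asarray(depth)

if __name__ == "__main__":
    frame = grab_depth_frame()
    print("depth frame shape:", frame.shape)   # expected: (480, 640)
    print("closest raw reading:", frame.min()) # smaller raw value = closer object
```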

Robbel's creation can generate detailed 3D maps of its surroundings and wirelessly send them to a host computer. It can also detect nearby humans and track their movements to understand where they want it to go. Robbel's research is aimed at creating a team of robots that could work together to find missing or trapped humans. Robbel admits there are concerns over whether a distressed person's gesture commands should be obeyed, but he believes a robot that has found one victim could potentially use that person's motions to find the next one more quickly.
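The building block behind those 3D maps is back-projecting each depth pixel into a 3D point using a pinhole camera model. The sketch below uses commonly cited approximate intrinsics for the Kinect depth camera, not values from Robbel's own calibration:

```python
# Sketch: convert a depth pixel (u, v) plus metric depth into a 3D point
# in camera coordinates. FX/FY/CX/CY are rough, commonly quoted Kinect
# depth-camera intrinsics, used here only for illustration.
FX, FY = 594.2, 591.0   # approximate focal lengths in pixels
CX, CY = 339.5, 242.7   # approximate principal point in pixels

def pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with depth in meters to (x, y, z)."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# A pixel near the image center at 2 m maps to roughly (0, 0, 2).
print(pixel_to_point(320, 240, 2.0))
```

Accumulating these points from many frames, while the SLAM code estimates the robot's pose, is what yields the kind of map that can be streamed to a host computer.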

Kinect has grabbed the attention of hackers trying to enable its motion-sensing capabilities in environments that don't include an Xbox. The result has been open source Kinect drivers, multitouch capabilities, interaction with Windows 7 and Mac OS X, and 3D camera applications. Now we've got robotics. What's next?

