The next wave of computing is perceptual


Perceptual computing seems to be on a lot of people’s minds lately. From a quest to find the Ultimate Coder to the worldwide Intel Perceptual Computing Challenge to a free Perceptual Computing SDK, there are plenty of opportunities for developers to get involved in this cutting-edge exploration of how humans interact with computers.

Editor’s Note:
This is a guest post by Wendy Boswell, technical blogger and writer at Intel. She is also the editor of About Web Search, part of The New York Times Company.

If you can visualize controlling your computer with your voice or a wave of your hand, rather than a mouse, a keyboard, or even a touchscreen, then you can see just the beginnings of what perceptual computing is capable of. Perceptual computing focuses on natural human interactions with machines, such as facial recognition, voice commands, and gesture control, in addition to the familiar input devices many of us have grown up with. At its core, perceptual computing is about responsive computing tailored to each user's unique needs.

There's a lot of exciting work going on in this space, and in this article we're going to focus on a few of the more inspiring and innovative projects that developers are taking on.

My, Oh Myo

One of the more interesting waves of development coming out of the perceptual computing movement is new user interfaces. How about a wearable armband that tracks your arm's muscle movements and lets you control your computer with a series of gestures? Watch the video of Thalmic Labs' Myo device below:

As you can see from the video, the Myo works with devices you already have in your home or office. Presentations can be controlled with a flick of the wrist, video games reach a whole new level of interaction, and browsing the web and watching videos become a completely different experience. More about this intriguing device:

What sort of precision does the MYO have?

The MYO detects gestures and movements in two ways: 1) muscle activity, and 2) motion sensing. When sensing the muscle movements of the user, the MYO can detect changes down to each individual finger. When tracking the position of the arm and hand, the MYO can detect subtle movements and rotations in all directions!

How quickly does it detect gestures?

Movements can be detected very quickly; sometimes it even looks like the gesture is recognized before your hand starts moving! This is because the muscles are activated slightly before your fingers actually start moving, and we are able to detect the gesture before that happens. – Myo FAQ
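Thalmic hasn't published the Myo's internals, but the FAQ's claim suggests a simple pipeline: watch the EMG (muscle activity) envelope, and the moment activation crosses a threshold, classify the gesture without waiting for the motion sensors to register any movement. Here is a minimal Python sketch of that idea; the sensor object, threshold value, and toy classifier are all invented for illustration:

```python
import time

EMG_ONSET = 0.3   # normalized activation threshold (assumed value)
GESTURES = ["fist", "wave_in", "wave_out", "fingers_spread"]

def emg_envelope(samples):
    """Rectify-and-average the EMG channels into one activation level."""
    return sum(abs(s) for s in samples) / len(samples)

def classify(samples):
    """Toy classifier: pick a gesture by the strongest EMG channel.
    A real armband would use a model trained per user."""
    strongest = max(range(len(samples)), key=lambda i: abs(samples[i]))
    return GESTURES[strongest % len(GESTURES)]

def detect_gesture(sensor):
    """Return a gesture from muscle activity alone, before motion starts.

    `sensor` is a hypothetical object with read_emg() -> list of floats,
    one per EMG channel; muscles fire tens of milliseconds before the
    fingers visibly move, which is why recognition can feel instant.
    """
    while True:
        emg = sensor.read_emg()
        if emg_envelope(emg) > EMG_ONSET:
            return classify(emg)
        time.sleep(0.005)   # poll at roughly 200 Hz
```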

Use the Force, Luke

Ever wanted to control something with your thoughts, Jedi-style? A team of researchers at the University of Minnesota has put together a four-blade helicopter that can be controlled with the electrical impulses generated by our minds:

More about this amazing technology and how it works:

“The team used a noninvasive technique known as electroencephalography (EEG) to record the electrical brain activity of five different subjects. Each subject was fitted with a cap equipped with 64 electrodes, which sent signals to the quadcopter over a WiFi network. The subjects were positioned in front of a screen that relayed images of the quadcopter's flight through an on-board camera, allowing them to see the course the way a pilot would. The plane, which was driven with a pre-set forward moving velocity, was then controlled by the subject's thoughts.” – LiveScience.com, “Tiny Helicopter Piloted by Human Thoughts”
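The study's control software isn't public, but the description above (a fixed forward velocity, 64-channel EEG, and commands sent over Wi-Fi) implies a loop roughly like the following sketch. The hemisphere-power decoder, packet format, and quadcopter address are invented stand-ins for the real trained classifier:

```python
import socket
import struct

FORWARD_SPEED = 0.5   # m/s; the real craft flies at a pre-set velocity

def decode_intent(eeg_frame):
    """Toy stand-in for a motor-imagery decoder over a 64-channel frame.

    The real system trains a classifier per subject; here we just compare
    average power on the left- vs. right-hemisphere channels and steer
    toward the stronger side. Returns steering in [-1, 1].
    """
    left = sum(abs(v) for v in eeg_frame[:32]) / 32
    right = sum(abs(v) for v in eeg_frame[32:]) / 32
    return max(-1.0, min(1.0, (right - left) * 10.0))

def fly(eeg_frames, copter_addr=("192.168.1.50", 9000)):
    """Stream (speed, steer) commands to the quadcopter over Wi-Fi via UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for frame in eeg_frames:          # iterable of 64-float EEG frames
        steer = decode_intent(frame)
        sock.sendto(struct.pack("!ff", FORWARD_SPEED, steer), copter_addr)
```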

The lead researcher on this project, Bin He, sees many more uses for this new human-computer interface, including helping people with paralysis:

"Our study shows that for the first time, humans are able to control the flight of flying robots using just their thoughts, sensed from noninvasive brain waves," said Bin He, lead scientist behind the study and a professor with the University of Minnesota's College of Science and Engineering. "Our next goal is to control robotic arms using noninvasive brain wave signals….with the eventual goal of developing brain-computer interfaces that aid patients with disabilities or neurodegenerative disorders."

Gesture recognition and Wi-Fi

You've probably heard of smart houses, right? Researchers in the University of Washington's Department of Computer Science and Engineering are taking that concept a few steps further, linking the existing wireless technology most people have in their homes with gesture recognition:

"Forget to turn off the lights before leaving the apartment? No problem. Just raise your hand, finger-swipe the air and your lights will power down. Want to change the song playing on your music system in the other room? Move your hand to the right and flip through the songs. University of Washington computer scientists have developed gesture-recognition technology that brings this a step closer to reality. They have shown it's possible to use Wi-Fi signals around us to detect specific movements without needing sensors on the human body or cameras. By using an adapted Wi-Fi router and a few wireless devices in the living room, users could control their electronics and household appliances from any room in the home with a simple gesture." – Washington.edu, “Wi-Fi signals enable gesture recognition throughout entire home”

Watch a video of this technology in action below:

This technology, called WiSee, doesn’t require a complex installation of cameras or sensors in every room; it merely uses existing technology (wireless signals) in a completely new and innovative way. All sorts of household tasks could conceivably be simplified using this technology.
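The published WiSee pipeline is more involved (it piggybacks on OFDM transmissions and a multi-antenna receiver), but its core insight fits in a few lines: a moving hand imparts a Doppler shift of just a few hertz on reflected Wi-Fi signals, and a gesture is a pattern of shifts toward and away from the receiver. A rough numpy sketch of that idea, with the sample rate and threshold chosen arbitrarily:

```python
import numpy as np

SAMPLE_RATE = 1000        # Hz; assumed rate of the downconverted signal
SHIFT_THRESHOLD = 2.0     # Hz; WiSee reports shifts of only a few hertz

def doppler_shift(baseband):
    """Dominant frequency of a complex baseband window of reflected Wi-Fi.

    Movement toward the receiver shows up as a small positive shift,
    movement away as a small negative one.
    """
    spectrum = np.fft.fftshift(np.fft.fft(baseband))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(baseband), 1 / SAMPLE_RATE))
    return freqs[np.argmax(np.abs(spectrum))]

def motion_pattern(windows):
    """Reduce a sequence of signal windows to a toward/away pattern.

    WiSee maps each gesture to such a pattern; a push-pull gesture,
    for example, looks like ["toward", "away"].
    """
    pattern = []
    for w in windows:
        shift = doppler_shift(w)
        if shift > SHIFT_THRESHOLD:
            pattern.append("toward")
        elif shift < -SHIFT_THRESHOLD:
            pattern.append("away")
    return pattern
```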

What’s next in perceptual computing?

Innovating on existing technology seems to be a theme with all three of the projects highlighted in this article, with new ways for humans and computers to interact at the forefront.

Recently, Valve’s Gabe Newell sat down with the Intel video team to talk about his impressions of perceptual computing as related to game development. Check out the video below for his views on how new kinds of input – like measured heart rate, perceived emotional state, contextual environmental cues, hand and eye movements, etc. – are shaping where new gaming experiences are going.

What do you think is coming next in perceptual computing? Share with us in the comments.


 
Gabe is starting to look like Santa Claus...

If only the Steam Summer Sale could be much cheaper. :D
 
I find pressing a button to be quite okay. I have a solar-powered keyboard too, so no worries about batteries, etc. So why do I need or want any of this stuff?
 
I just noticed it today: is the "controller" worn by Tom Cruise an advanced bowling accessory? :)
 
You almost ruined it for me :eek:, but I checked: Tom Cruise's controller would (as a bowling glove) protect the wrong fingers.
 
The mouse and keyboard work pretty well. I don't want to have to wave my arm around to use my PC, and I sure don't want my entire office to have voice-operated computers.

If there's one technology I'd like to see, it's eye tracking. I can move my eyes faster and more accurately than I can move a mouse cursor. Split my spacebar into three keys: the middle stays a space, the right key is a left click, and the left key is a right click (or reverse it if you prefer), and let my eyes replace the movement of the mouse. I could then use the keyboard and "mouse" without taking my hands off the keyboard. Moving my eyes won't affect what I'm typing because I still have to click with my fingers.
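Roughly what I mean, as a Python sketch: cursor movement comes from a gaze callback (the eye-tracker API here is made up), and two ordinary keys stand in for the split-spacebar halves (the clicking side uses the real pynput library):

```python
from pynput import keyboard, mouse

cursor = mouse.Controller()

def on_gaze(x, y):
    """Would be registered as the callback of an eye tracker's SDK
    (hypothetical here): look somewhere and the cursor follows."""
    cursor.position = (x, y)

def on_press(key):
    # Two keys stand in for the split-spacebar halves.
    if key == keyboard.Key.right:    # right key -> left click
        cursor.click(mouse.Button.left)
    elif key == keyboard.Key.left:   # left key -> right click
        cursor.click(mouse.Button.right)

# Gaze drives movement, fingers confirm, so looking around
# never garbles what you're typing.
with keyboard.Listener(on_press=on_press) as listener:
    listener.join()
```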
 
"The mouse and keyboard work pretty well. I don't want to have to wave my arm around to use my PC, and I sure don't want my entire office to have voice operated computers."

Exactly. I also don't mind getting up off my fat you-know-what to turn the lights on and off. I can see where some of these perceptual computing ideas might come in handy in a manufacturing or even a medical environment. But for day-to-day home use? More annoyance (and frankly more work) than what's currently available. Can you imagine having a house full of rambunctious kids and having your appliances, lights, and TVs doing all sorts of things because the kids are jumping around and yelling? LOL...
 
In the meantime, we have been fed promises from Leap Motion for how long now? I gave up waiting...
 
I can imagine it a bit. This is easy for those of us acquainted with tech, but for others who still can't figure out which remote is which, where to point it, or what so many buttons do, this is just more confusion. They lift the remote to use it and something has already happened; they say something and a voice speaks back at them. They just won't understand.
 
I remember seeing a web article recently where someone commented that folks kept coming to them wanting an interface that looks like Minority Report's. And here we are with an article and a giant pic of Tom Cruise.

The issue is, it looks good in the movies, but is holding your arms out in front of you and windmilling about for an 8-hour workday any more ergonomic or sensible than what we've got now? Or is this a case of clueless pointy-haired people saying, "That Tom Cruise movie sure had a cool-looking computer, give us that!"
 
Huh, really? Yeah, a few months ago Ray Kurzweil (Google) said that by 2045 they'll be able to upload your brain onto a computer and modify your DNA through computer code... This sounds crazy, but really, look it up.
 