Jason Wishnov, CEO, founder, and lead designer at Iridium Studios, recently sat down for a brief video interview to discuss Iridium's new tactical war game, "There Came An Echo," and how it uses the speech recognition capabilities of the new Intel RealSense Technology (formerly Perceptual Computing) to direct combat and gameplay with recognizable voice commands, much like on a real battlefield.
Check out what Iridium has been up to lately:
First-person shooter (FPS) games are typically very linear: you've got a disembodied voice in your ear telling you to go here or shoot there. Iridium Studios figured it would be an interesting experience to turn that around and put gamers on the other side of the game: you give the commands and control the story.
Ideas are easy; implementation is the tricky part. How do you put the user truly in charge of the gameplay? Buttons? Controllers? What if you could give them voice command control, much like on a real battlefield? That's what Iridium Studios set out to do, and they report voice-command recognition accuracy in the mid-90 percent range.
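To make that accuracy figure concrete, here is a minimal sketch (not Iridium's actual pipeline) of one way a game might map a recognizer's imperfect transcript onto a fixed command vocabulary, using only Python's standard library. The command phrases are hypothetical examples:

```python
import difflib

# Hypothetical command vocabulary; a real game would define its own set.
COMMANDS = ["move to alpha", "move to bravo", "open fire", "hold fire", "take cover"]

def match_command(transcript, cutoff=0.6):
    """Map a (possibly imperfect) speech transcript to the closest known command.

    Returns None when nothing matches above the cutoff, so the game can ask
    the player to repeat instead of acting on a misrecognition.
    """
    matches = difflib.get_close_matches(transcript.lower(), COMMANDS, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(match_command("open fire"))           # exact hit -> "open fire"
print(match_command("opne fire"))           # near-miss still resolves -> "open fire"
print(match_command("make me a sandwich"))  # no match -> None
```

Rejecting low-confidence matches rather than guessing is one simple way a voice-driven game keeps its effective accuracy high: a missed command costs a repeat, while a wrong command costs the battle.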
Human and computer interaction is evolving beyond the traditional input controls that we’re familiar with. Voice recognition makes it possible for our digital worlds to interact with our physical, organic worlds in meaningful ways. Many of the projects that developers are creating step across boundaries that just a few years ago would have been impossible to imagine.
If you can visualize controlling your computer with your voice or a wave of your hand, rather than a mouse and keyboard or even a touchscreen, then you can see just the beginnings of what this technology is capable of. RealSense computing focuses on natural human interactions with machines, facial recognition, voice commands, gesture swiping, and more, in addition to the familiar controls many of us have grown up with. The result is responsive computing tailored to each individual's unique needs.
Are voice commands better than traditional controls? Jason reports that from an accuracy-focused point of view, traditional controls will probably win out (for now), but voice controls are quickly gaining ground. Plus, voice controls give you a much more intuitive sense of immersion in the game.
Instead of being told to run here or there, you’re actually able to give commands, and use your own battle tactics to solve problems as they come up. That’s a heady mixture that is sure to take off as the technology grows more sophisticated, and this lends itself especially well to FPS games.
RealSense technology gives developers more opportunity to create a greater experience for game players. Iridium Studios used the new RealSense SDK to build their newest projects, and they report that between the updated support for voice commands and the Unity integration, they were able to accomplish a lot. Their goal? To put out great games, of course, but also to convince gamers that alternative input methods like voice and gesture are viable forms of control in FPS games.
What kinds of innovations does this SDK support? There are several, from speech recognition, as seen in Iridium's "There Came An Echo," to hand and finger tracking that turns gestures into actions, facial analysis as a perceptual computing component in games and other interactive applications, augmented reality that tracks 2D/3D objects and incorporates them into the experience in real time, and even background subtraction.
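However the SDK delivers a recognized phrase, a game still has to turn it into an action. A common pattern for that last step is a dispatch table mapping command strings to handler functions; the sketch below is a hedged illustration of that pattern, not the RealSense SDK API, and all unit and command names in it are invented:

```python
# Hypothetical handlers a tactical game might expose to voice control.
def move_unit(unit, waypoint):
    return f"{unit} moving to {waypoint}"

def open_fire(unit):
    return f"{unit} engaging"

# Command grammar: recognized phrase -> (handler, fixed arguments).
DISPATCH = {
    "alpha move to bravo": (move_unit, ("alpha", "bravo")),
    "alpha open fire": (open_fire, ("alpha",)),
}

def on_speech_recognized(phrase):
    """Called with the recognizer's best hypothesis for the spoken phrase."""
    entry = DISPATCH.get(phrase.lower().strip())
    if entry is None:
        return "unrecognized command"
    handler, args = entry
    return handler(*args)

print(on_speech_recognized("Alpha open fire"))  # -> "alpha engaging"
```

The same dispatch shape works for gesture input: a swipe classifier's output label simply becomes another key in the table.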
This SDK is free to download, and developers will also want to check out the Perceptual Computing Forums, a message board dedicated to the challenges of developing apps with this new technology.
Whether you’re a developer or a gamer, what ideas come to mind for RealSense and “real life” integration in creating better computing experiences?