Gesture Control

Augmented Reality, Gestures & The Human Element

Moritz v. Grotthuss, CEO of gestigon, has an articulate understanding of the state and challenges of AR technology. “Over the past twelve months we’ve seen the first beta deliveries of augmented and virtual reality solutions,” Grotthuss explained. “In most of them, the human element is missing! For example, you don’t see your hands when you put on a pair of virtual reality glasses.”

Grotthuss, who will be in San Francisco speaking about and demonstrating gestigon technology at the Inside AR conference next month, is laser-focused on enriching the user interface and making AR a more human experience. “When you scan a room with your tablet, you may get all kinds of interesting data about the objects in the room, but the human beings are usually just displayed as blobs,” he shared. “gestigon’s mission is to change that.”

Computers recognizing human context is key to making that happen. Gestures, facial expressions, and vocal intonations can be indicative of intent or emotion, which means AR solutions must also incorporate elements of affective computing to fully understand human context. gestigon gives “computers the ability to recognize the human context, ranging from enabling your hands to manipulate a virtual menu that your smart glasses are displaying to providing a more complete picture on the intent of a particular Human/Machine interaction,” Grotthuss explained.

To learn more about AR alongside more than 3,800 other AR aficionados and to connect with gestigon, plan to be in San Francisco May 20-21, 2015, for the Inside AR conference. Tickets and more information are available online.