Google’s Advanced Technologies and Projects (ATAP) unit wants you to control your smart watch with a flick of your fingers. The research unit, tasked with advancing the state of mobile technology, unveiled a new project Friday morning at Google’s I/O developer conference that takes a very different approach to interacting with mobile devices.
At the center of ATAP’s Project Soli is a small chip that uses radar to detect very subtle hand gestures for interacting with mobile devices. ATAP technical program lead Ivan Poupyrev demonstrated a prototype of the chip on stage, flicking his thumb against his index finger in the air to set the time on a watch and to play a very simple video game.
Poupyrev said that ATAP ventured into gestures after realizing that shrinking screen sizes limit how well humans can interact with technology. Hand gestures seemed a much more natural fit, building on the physical interactions we are already used to from everyday life. The human hand is “always with you, intuitive, easy to use, and very ergonomic,” he joked.
On a more serious note, radar-based gesture recognition could be the missing link for virtual reality headset developers trying to give users a way to interact with virtual worlds without relying on complicated camera set-ups to track users or forcing them to wear awkward data gloves. Users may eventually be able to just wear a wrist band, or manufacturers may even be able to add a Soli-like chip to the headset itself to track hands and map gestures, allowing more natural interaction with VR environments.
However, don’t expect your next smart watch or VR headset to listen when you snap your fingers just yet: Poupyrev said that a development kit for Project Soli will be available to developers this fall, which means that consumer products likely won’t arrive for another two years at the earliest.