Project Soli – Revolutionizing Gesture Recognition Technology

We are all aware that Google has pioneered a number of leading inventions and projects that have not only changed the way we perceive science and tech but have also helped make our lives easier. From Google Cardboard to driverless cars, Google Glass, and augmented reality in Google Maps (I mean, who would have thought of this stuff!), Google keeps delivering the coolest range of products. While Apple was removing the headphone jack from its phones, Google had been developing something very, very awesome!

Have you ever fantasized about lowering the volume of your system or changing the time on your phone with just a certain set of gestures? It reminds me of the cool stuff in Iron Man. Coming back to the topic: in 2015, during its annual developer conference, Google announced the world’s first radar-based key technology project, one that would change the way we interact with wearable devices.

ABOUT PROJECT SOLI

Soli is a creation of Google’s research and development lab ATAP (Advanced Technology and Projects), headed by Dr. Ivan Poupyrev, an inventor and designer who blends the digital and physical worlds.

Soli is a purpose-built interaction sensor that uses radar for motion tracking of the human hand. The Soli chip incorporates the entire sensor and antenna array into an ultra-compact 8mm x 10mm package.

The concept of virtual tools is the key to Soli interactions. Virtual tools are gestures that mimic familiar interactions with physical tools. This metaphor makes it easier to communicate, learn and remember Soli interactions.

VIRTUAL TOOL GESTURES

Imagine an invisible button between your thumb and index finger, which you can press by tapping your fingers together. Consider a virtual dial that you control by rubbing your thumb against your index finger, or a virtual slider for controlling a device’s volume. These are the kinds of interactions Google has been training Soli to recognize.

Even though these controls are virtual, the interactions feel physical and responsive. Feedback is generated by the haptic sensation of fingers touching each other. Without the constraints of physical controls, these virtual controls can take on the fluidity and precision of natural human hand motion.
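To make the virtual-tool metaphor concrete, here is a minimal sketch of how recognized gesture labels could drive a device action such as volume control. The gesture names and the Device class are hypothetical illustrations; a real system would receive these labels from the Soli recognizer rather than hard-coded strings.

```python
# Hypothetical mapping from virtual-tool gestures to device actions.
# "dial" = thumb rubbing against the index finger (the virtual dial);
# "button" = fingers tapping together (the virtual button).

class Device:
    def __init__(self):
        self.volume = 50  # volume on a 0-100 scale

    def handle(self, gesture: str, amount: int = 1) -> None:
        if gesture == "dial_clockwise":
            self.volume = min(100, self.volume + amount)
        elif gesture == "dial_counterclockwise":
            self.volume = max(0, self.volume - amount)
        elif gesture == "button_press":
            self.volume = 0  # e.g. mute on a virtual button tap

d = Device()
d.handle("dial_clockwise", 10)
print(d.volume)  # -> 60
```

Because the gesture set mimics physical controls, the mapping from label to action stays small and predictable.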

HOW DOES IT WORK?

The sensor technology works by emitting electromagnetic waves in a broad beam.

Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay and frequency shift, capture rich information about the object’s characteristics and dynamics, including size, shape, orientation, material, distance and velocity.
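Two of the properties mentioned above map directly to physical quantities via standard radar equations: the round-trip time delay gives distance, and the Doppler frequency shift gives radial velocity. A minimal sketch, assuming a 60 GHz carrier (the band Soli operates in); the echo values below are made-up examples, not real sensor data:

```python
# Recovering range and radial velocity from reflected-signal properties.
C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # Soli operates in the 60 GHz band

def range_from_delay(time_delay_s: float) -> float:
    """Round-trip delay -> target distance: d = c * tau / 2."""
    return C * time_delay_s / 2

def velocity_from_doppler(freq_shift_hz: float) -> float:
    """Doppler shift -> radial velocity: v = f_d * c / (2 * f_c)."""
    return freq_shift_hz * C / (2 * F_CARRIER)

# A hand ~30 cm away produces a ~2 ns round-trip delay:
print(round(range_from_delay(2e-9), 3))        # -> 0.3 (metres)
# A 400 Hz Doppler shift at 60 GHz corresponds to 1 m/s:
print(round(velocity_from_doppler(400.0), 3))  # -> 1.0
```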

Soli tracks and recognizes dynamic gestures expressed by fine motions of the fingers and hand. Unlike traditional radar sensors, Soli does not rely on large bandwidth and high spatial resolution; in fact, Soli’s spatial resolution is coarser than the scale of most fine finger gestures. The focus is instead on extracting subtle changes in the received signal over time. By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.
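The idea of favouring temporal variation over spatial resolution can be sketched very simply: instead of resolving fine detail in any one frame, measure how much the received signal changes from frame to frame. This is an illustrative toy, not Soli's actual processing:

```python
import numpy as np

def temporal_features(frames: np.ndarray) -> np.ndarray:
    """Per-step energy of change between consecutive radar frames.

    frames: (n_frames, n_range_bins) array of received-signal magnitudes.
    Returns an (n_frames - 1,) vector: a moving finger yields large
    values, a static hand yields values near zero.
    """
    diffs = np.diff(frames, axis=0)           # frame-to-frame variation
    return np.sqrt((diffs ** 2).sum(axis=1))  # energy of each variation

# Identical frames (a perfectly still hand) produce zero temporal energy:
static = np.tile(np.linspace(0, 1, 64), (10, 1))
print(temporal_features(static).max())  # -> 0.0
```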

SOLI GESTURE RECOGNITION

The Soli software architecture consists of a generalized gesture recognition pipeline that requires no special hardware adaptations and can work with different types of systems without compatibility issues. The pipeline implements several stages of signal abstraction: from raw radar data to signal transformations, core and abstract machine learning features, and detection and tracking.

The Soli libraries extract real-time signals from the radar hardware, outputting signal transformations, high-precision motion data, and gesture labels at frame rates from 100 to 10,000 frames per second.
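The staged pipeline described above can be sketched as a chain of plain functions, mirroring its hardware-agnostic design: raw radar frame, then signal transformation, then compact features, then a label. The stage implementations here are hypothetical stand-ins, and the classifier is a stub rather than Soli's trained model:

```python
import numpy as np

def transform(raw_frame: np.ndarray) -> np.ndarray:
    # Stage 1: signal transformation, e.g. a magnitude spectrum
    return np.abs(np.fft.rfft(raw_frame))

def extract_features(spectrum: np.ndarray) -> np.ndarray:
    # Stage 2: core features (total energy and spectral centroid)
    energy = spectrum.sum()
    centroid = (np.arange(len(spectrum)) * spectrum).sum() / max(energy, 1e-9)
    return np.array([energy, centroid])

def classify(features: np.ndarray) -> str:
    # Stage 3: detection stub -- a real pipeline uses a trained model
    return "motion" if features[0] > 1.0 else "idle"

frame = np.zeros(128)  # a silent frame with no echo energy
print(classify(extract_features(transform(frame))))  # -> idle
```

Because each stage only depends on the previous stage's output, any radar front end that produces frames can feed the same pipeline.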

The Soli sensor is a fully integrated, low-power radar. Over 10 months of design and development, Google greatly reduced the radar system’s design complexity and power consumption compared to its initial prototypes: from a large benchtop unit comprising multiple cooling fans to a compact 8mm x 10mm chip that can be easily integrated into small, mobile consumer devices.

APPLICATIONS OF SOLI

The Soli chip can be embedded in wearables, phones, computers, cars and IoT devices. Soli has no moving parts, fits onto a single chip and consumes little energy. It is not affected by light conditions and works through most materials. I guess now it’s our turn to think of the possibilities.

Project Soli will forever change the way we interact with our devices. The hand will become the ultimate input device. Such a revolutionary device will surely bridge the gap between the virtual and the real world.

Also, if you would like your mind to be blown, do watch these:

PROJECT SOLI – GOOGLE ATAP

https://www.youtube.com/watch?v=0QNiZfSsPc0&t=8s

PROJECT SOLI – ALPHA DEVELOPERS SHOWCASE

https://www.youtube.com/watch?v=H41A_IWZwZI

– Atharva Mhetar
