{"id":253,"date":"2018-11-04T13:52:46","date_gmt":"2018-11-04T13:52:46","guid":{"rendered":"http:\/\/nitk.acm.org\/blog\/?p=253"},"modified":"2018-11-04T14:52:01","modified_gmt":"2018-11-04T14:52:01","slug":"project-soli-revolutionizing-the-gesture-recognition-technology","status":"publish","type":"post","link":"https:\/\/nitk.acm.org\/blog\/2018\/11\/04\/project-soli-revolutionizing-the-gesture-recognition-technology\/","title":{"rendered":"Project Soli &#8211; Revolutionizing the Gesture Recognition Technology"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">We are all aware that Google has pioneered a number of inventions and projects that have not only changed the way we perceive science and tech but have also helped make our lives easier. From Google Cardboard and driverless cars to Google Glass and Augmented Reality in Google Maps (I mean, who would have thought of this stuff!), Google keeps delivering the coolest range of products. While Apple was removing the headphone jack from its phones, Google had been developing something very, very awesome!<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Have you ever fantasized about lowering the volume of your system or changing the time on your phone with just a certain set of gestures? Well, this reminds me of the cool stuff in Iron Man. Coming back to the topic: in 2015, at its annual developer conference, Google announced the world\u2019s first radar-based gesture technology project; it would change the way we interact with wearable devices.<\/span><\/p>\n<p><span style=\"font-weight: 600;\">ABOUT PROJECT SOLI<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Soli is a creation of Google\u2019s research and development lab ATAP (Advanced Technology and Projects), headed by Dr. Ivan Poupyrev, an inventor and designer who blends the digital and physical worlds. 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Soli is a purpose-built interaction sensor that uses radar for motion tracking of the human hand. The Soli chip incorporates the entire sensor and antenna array into an ultra-compact 8mm x 10mm package.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-256\" src=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/786CE7CB-DAB8-449F-8BB5-B2F2368F2A53.jpeg\" alt=\"\" width=\"318\" height=\"159\" srcset=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/786CE7CB-DAB8-449F-8BB5-B2F2368F2A53.jpeg 318w, https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/786CE7CB-DAB8-449F-8BB5-B2F2368F2A53-300x150.jpeg 300w\" sizes=\"auto, (max-width: 318px) 100vw, 318px\" \/><\/p>\n<p><span style=\"font-weight: 400;\">The concept of virtual tools is the key to Soli interactions. Virtual tools are gestures that mimic familiar interactions with physical tools. This metaphor makes it easier to communicate, learn and remember Soli interactions.<\/span><\/p>\n<p><span style=\"font-weight: 600;\">VIRTUAL TOOL GESTURES<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Imagine an invisible button between your thumb and index finger, which you can press by tapping your fingers together. Consider a virtual dial that you can control by rubbing your thumb against your index finger, or a virtual slider for controlling the volume of your devices. These are the kinds of interactions that Google has been familiarizing Soli with.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Even though these controls are virtual, the interactions feel physical and responsive. Feedback is generated by the haptic sensation of fingers touching each other. Without the constraints of physical controls, these virtual controls can take on the fluidity and precision of natural human hand motion. 
<\/span><\/p>\n<p><span style=\"font-weight: 600;\">HOW DOES IT WORK?<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The sensor works by emitting electromagnetic waves in a broad beam.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay and frequency shift, capture rich information about the object\u2019s characteristics and dynamics, including size, shape, orientation, material, distance and velocity.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-257\" src=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/E1A33963-6662-4A2E-8D2A-06A4ACD07722.jpeg\" alt=\"\" width=\"680\" height=\"370\" srcset=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/E1A33963-6662-4A2E-8D2A-06A4ACD07722.jpeg 680w, https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/E1A33963-6662-4A2E-8D2A-06A4ACD07722-300x163.jpeg 300w\" sizes=\"auto, (max-width: 680px) 100vw, 680px\" \/><\/p>\n<p><span style=\"font-weight: 400;\">Soli tracks and recognizes dynamic gestures expressed by fine motions of the fingers and hand. Unlike traditional radar sensors, Soli does not require large bandwidth or high spatial resolution; in fact, Soli\u2019s spatial resolution is coarser than the scale of most fine finger gestures. Instead, the focus is on extracting subtle changes in the received signal over time. 
By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.<\/span><\/p>\n<p><span style=\"font-weight: 600;\">SOLI GESTURE RECOGNITION<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The Soli software architecture consists of a generalized gesture recognition pipeline that does not require any special hardware adaptations and can work with different types of systems without compatibility issues. The pipeline implements several stages of signal abstraction &#8211; from the raw radar data to signal transformations, core and abstract machine learning features, and detection and tracking.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The Soli libraries extract real-time signals from the radar hardware, outputting signal transformations, high-precision position and motion data, and gesture labels at frame rates from 100 to 10,000 frames per second.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-258\" src=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/6247BF6A-D189-4C2C-82F3-356290B1FE1A.jpeg\" alt=\"\" width=\"638\" height=\"359\" srcset=\"https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/6247BF6A-D189-4C2C-82F3-356290B1FE1A.jpeg 638w, https:\/\/nitk.acm.org\/blog\/wp-content\/uploads\/2018\/11\/6247BF6A-D189-4C2C-82F3-356290B1FE1A-300x169.jpeg 300w\" sizes=\"auto, (max-width: 638px) 100vw, 638px\" \/><\/p>\n<p><span style=\"font-weight: 400;\">The Soli sensor is a fully integrated, low-power radar. 
Over the course of 10 months of design and development, Google greatly reduced the radar system\u2019s complexity and power consumption compared to its initial prototypes, going from a large bench-top unit with multiple cooling fans to a compact 8mm x 10mm chip that can be easily integrated into small, mobile consumer devices.<\/span><\/p>\n<p><span style=\"font-weight: 600;\">APPLICATIONS OF SOLI<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The Soli chip can be embedded in wearables, phones, computers, cars and IoT devices. Soli has no moving parts, fits onto a single chip and consumes little energy. It is not affected by light conditions and works through most materials. I guess now it\u2019s our turn to think of the possibilities.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Project Soli will forever change the way we interact with our devices. The hand will be the ultimate input device. Such a revolutionary device will surely bridge the gap between the virtual and the real world. 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Also, if you would like your mind to be blown, do watch these:<\/span><\/p>\n<p><span style=\"font-weight: 600;\">PROJECT SOLI &#8211; GOOGLE ATAP <\/span><\/p>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=0QNiZfSsPc0&amp;t=8s\"><span style=\"font-weight: 400;\">https:\/\/www.youtube.com\/watch?v=0QNiZfSsPc0&amp;t=8s<\/span><\/a><\/p>\n<p><span style=\"font-weight: 600;\">PROJECT SOLI &#8211; ALPHA DEVELOPERS SHOWCASE <\/span><\/p>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=H41A_IWZwZI\"><span style=\"font-weight: 400;\">https:\/\/www.youtube.com\/watch?v=H41A_IWZwZI<\/span><\/a><\/p>\n<p style=\"text-align: right;\"><em>&#8211; Atharva Mhetar<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>We are all aware that Google has pioneered a number of inventions and projects that have not only changed the way we perceive science and tech but have also helped make our lives easier. 
Right from Google Cardboard to Driverless cars, Google glasses, using Augmented Reality in&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_exactmetrics_skip_tracking":false,"_exactmetrics_sitenote_active":false,"_exactmetrics_sitenote_note":"","_exactmetrics_sitenote_category":0,"footnotes":""},"categories":[10,26],"tags":[51,52,53,50],"class_list":["post-253","post","type-post","status-publish","format-standard","hentry","category-tech","category-vidyut","tag-gesture-technology","tag-google","tag-google-inventions","tag-project-soli"],"_links":{"self":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/253","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/comments?post=253"}],"version-history":[{"count":4,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/253\/revisions"}],"predecessor-version":[{"id":261,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/posts\/253\/revisions\/261"}],"wp:attachment":[{"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/media?parent=253"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/categories?post=253"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/nitk.acm.org\/blog\/wp-json\/wp\/v2\/tags?post=253"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}