The human hand is the ultimate input device: it is extremely precise, extremely fast, and very natural for us to use. Capturing the possibilities of the human hand is one of the ideas Project Soli is built on. Project Soli is working on bringing the incredible capability and finesse of human hand actions to the virtual world.
The technology behind Project Soli is radar, operating in the radio frequency spectrum, used to track the human hand. Radars have long been used to track large objects such as cars, satellites, and planes; in Project Soli, the same technology is used to track the micro-motions and twitches of the human hand, and to turn those motions into interactions with wearables, Internet of Things devices, and other computing devices.
Project Soli is mostly focused on taking radar hardware and turning it into a gesture sensor. A radar transmits a radio wave toward a target, and its receiver intercepts the energy reflected from that target. It is possible to interpret so much from this single radar signal because of the full gesture recognition pipeline Google has built for Project Soli: the various stages of the pipeline are designed to extract specific gesture information from the one radar signal received at a high frame rate. Radar also has some unique properties, such as very high positional accuracy, which means it can sense the tiniest motions.
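To make the sensing idea concrete, here is a minimal, purely illustrative sketch of how a radar can turn pulse-to-pulse phase changes into a motion estimate. This is not Soli's actual pipeline; the pulse rate and frame size are hypothetical, and the only fact taken from the real system is that Soli-class radars operate around the 60 GHz band. A hand moving toward the sensor shifts the phase of each successive echo (the Doppler effect), and a DFT over a frame of pulses recovers the speed of that motion:

```python
import cmath
import math

C = 3e8           # speed of light, m/s
FREQ = 60e9       # carrier frequency; Soli-class radars use the 60 GHz band
PRF = 1000.0      # pulse repetition frequency, Hz (hypothetical)
N_PULSES = 50     # pulses per processing frame (hypothetical)

def simulate_echo_phases(velocity_mps):
    """Complex echo samples from a target moving at a constant speed.

    Each pulse's echo picks up an extra phase rotation proportional to
    how far the target moved since the previous pulse.
    """
    wavelength = C / FREQ
    doppler_hz = 2.0 * velocity_mps / wavelength  # two-way Doppler shift
    return [cmath.exp(2j * math.pi * doppler_hz * n / PRF)
            for n in range(N_PULSES)]

def estimate_velocity(samples):
    """Find the strongest DFT bin across pulses and convert it to a speed."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(n):
        acc = sum(s * cmath.exp(-2j * math.pi * k * m / n)
                  for m, s in enumerate(samples))
        if abs(acc) > best_mag:
            best_bin, best_mag = k, abs(acc)
    # Bins above n/2 are negative frequencies: motion away from the sensor.
    freq = best_bin * PRF / n if best_bin <= n // 2 else (best_bin - n) * PRF / n
    wavelength = C / FREQ
    return freq * wavelength / 2.0

# A hand moving toward the sensor at 0.5 m/s is recovered from phase alone.
echo = simulate_echo_phases(0.5)
print(round(estimate_velocity(echo), 2))  # → 0.5
```

At 60 GHz the wavelength is only 5 mm, which is why even sub-millimeter finger motions produce measurable phase changes; a real gesture pipeline would stack many such per-frame estimates and feed them to a classifier.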
At the heart of Project Soli is the idea of virtual tools. Imagine a button between your thumb and your index finger: the button is not physically there, but pressing your fingers together is a very clear action, and a natural haptic feedback occurs as you perform it. The hand can embody a virtual tool and act on that virtual tool at the same time. If we have technology that can recognize these actions, then we have an interesting direction for interacting with technology, and that is what Project Soli is all about.
Project Soli has pushed the processing further out into the electronics themselves to do the sensing, and we are now at the point where the hardware can sense these interactions and put them to work. We can explore how well they work, and how well they might work in products. This project could change how we interact with technology.