
Report: Google's New ASIC Revolutionizes Gesture Control

The Soli sensor is a fully integrated, low-power radar operating in the 60-GHz ISM band. In our journey toward this form factor, we rapidly iterated through several hardware prototypes, beginning with a large bench-top unit built from off-the-shelf components — including multiple cooling fans. Over the course of 10 months, we redesigned and rebuilt the entire radar system into a single solid-state component that can be easily integrated into small, mobile consumer devices and produced at scale.


The custom-built Soli chip greatly reduces radar system design complexity and power consumption compared to our initial prototypes. We developed two modulation architectures: a Frequency Modulated Continuous Wave (FMCW) radar and a Direct-Sequence Spread Spectrum (DSSS) radar. Both chips integrate the entire radar system into the package, including multiple beamforming antennas that enable 3D tracking and imaging with no moving parts.
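To make the FMCW approach concrete, here is a minimal ranging simulation: the echo from a target is mixed with the outgoing chirp, leaving a beat tone whose frequency is proportional to distance. All parameters below are illustrative assumptions, not published Soli specifications.

```python
# Minimal FMCW ranging sketch (NumPy). Every parameter here is an
# illustrative assumption, not a published Soli chip specification.
import numpy as np

c = 3e8            # speed of light, m/s
B = 7e9            # assumed sweep bandwidth (the 57-64 GHz ISM band)
T = 100e-6         # assumed chirp duration, s
fs = 20e6          # assumed baseband sample rate, Hz
R_true = 0.30      # simulated target: a hand 30 cm from the sensor

t = np.arange(0, T, 1 / fs)
slope = B / T                  # chirp slope, Hz/s
tau = 2 * R_true / c           # round-trip delay of the echo, s

# Mixing the received chirp against the transmitted one leaves a "beat"
# tone whose frequency is proportional to the delay: f_beat = slope * tau.
beat = np.cos(2 * np.pi * slope * tau * t)

spectrum = np.abs(np.fft.rfft(beat))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
f_beat = freqs[np.argmax(spectrum)]
print(f"range estimate: {f_beat * c / (2 * slope):.2f} m")  # -> 0.30 m
```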


Imagine an invisible button between your thumb and index finger: you press it by tapping your fingers together. Or a Virtual Dial that you turn by rubbing your thumb against your index finger. Imagine grabbing and pulling a Virtual Slider in thin air. These are the kinds of interactions we are imagining and developing.


Even though these controls are virtual, the interactions feel physical and responsive. Feedback is generated by the haptic sensation of fingers touching each other. Without the constraints of physical controls, these virtual tools can take on the fluidity and precision of our natural human hand motion.

Soli sensor technology works by emitting electromagnetic waves in a broad beam. Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay, and frequency shift, capture rich information about the object's characteristics and dynamics, including size, shape, orientation, material, distance, and velocity.
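It is worth working out the scale of these quantities at 60 GHz. Under some rough assumptions (a hand 30 cm from the sensor, a fingertip moving at 0.1 m/s), the echo returns after about 2 ns and the Doppler shift is a few tens of hertz:

```python
# Back-of-the-envelope numbers for a 60 GHz radar; the target distance and
# speed are assumptions chosen to match a hand-gesture scenario.
c = 3e8                       # speed of light, m/s
fc = 60e9                     # carrier frequency, Hz
wavelength = c / fc           # 5 mm at 60 GHz

R = 0.30                      # hand 30 cm from the sensor
v = 0.10                      # fingertip radial speed, m/s

tau = 2 * R / c               # time delay encodes distance   -> 2.0 ns
f_d = 2 * v / wavelength      # Doppler shift encodes velocity -> 40 Hz

print(f"wavelength {wavelength * 1e3:.0f} mm, "
      f"delay {tau * 1e9:.1f} ns, Doppler {f_d:.0f} Hz")
```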


Soli tracks and recognizes dynamic gestures expressed by fine motions of the fingers and hand. To accomplish this with a single-chip sensor, we developed a novel radar sensing paradigm with tailored hardware, software, and algorithms. Unlike traditional radar sensors, Soli does not require large bandwidth and high spatial resolution; in fact, Soli's spatial resolution is coarser than the scale of most fine finger gestures. Instead, our fundamental sensing principles rely on motion resolution: extracting subtle changes in the received signal over time. By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.
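One way to see why temporal processing can compensate for coarse spatial resolution: at 60 GHz the wavelength is about 5 mm, so even a sub-millimeter displacement rotates the echo's phase by a large, easily measured angle. The sketch below illustrates the general principle under assumed parameters; it is not Soli's actual algorithm.

```python
# Sketch of motion resolution from phase, under assumed parameters; this
# illustrates the general principle, not Soli's proprietary processing.
import numpy as np

c, fc = 3e8, 60e9
wavelength = c / fc                         # ~5 mm

# A 0.5 mm displacement changes the round-trip path by 1 mm and rotates
# the echo phase by 2*pi * (2 * dx) / wavelength, i.e. about 72 degrees.
dx = 0.5e-3
print(f"{np.degrees(2 * np.pi * 2 * dx / wavelength):.0f} deg per 0.5 mm")

# Tracking that phase across successive frames recovers displacement far
# finer than the radar's range resolution:
positions = np.linspace(0, 2e-3, 40)        # simulated 0-2 mm finger motion
phases = 2 * np.pi * 2 * positions / wavelength
recovered = np.unwrap(phases) * wavelength / (4 * np.pi)
print(f"max tracking error: {np.abs(recovered - positions).max():.2e} m")
```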


The Soli software architecture consists of a generalized gesture-recognition pipeline that is hardware-agnostic and can work with different types of radar. The pipeline implements several stages of signal abstraction: from raw radar data, through signal transformations, core and abstract machine-learning features, detection and tracking, and gesture probabilities, to UI tools that interpret gesture controls.
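As a reading aid, those stages can be mirrored in a skeleton like the following. The stage names come from the description above; the code itself, including every function signature, is invented for illustration.

```python
# Hypothetical skeleton of the staged pipeline described above. The stage
# names follow the article; all of the code here is an invented stub.
def signal_transforms(frames):        # raw radar data -> e.g. range-Doppler maps
    return frames

def extract_features(transformed):    # core + abstract machine-learning features
    return transformed

def detect_and_track(features):       # detection and tracking
    return features

def classify_gestures(tracks):        # per-gesture probabilities
    return {"virtual_dial": 0.0}

def interpret_for_ui(probabilities):  # UI tools interpreting gesture controls
    return max(probabilities, key=probabilities.get)

def run_pipeline(raw_radar_frames):
    # Each stage abstracts the signal one level further; the pipeline is
    # intended to be hardware-agnostic past the first stage.
    transformed = signal_transforms(raw_radar_frames)
    features = extract_features(transformed)
    tracks = detect_and_track(features)
    return interpret_for_ui(classify_gestures(tracks))
```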

The Soli SDK enables developers to easily access and build upon our gesture-recognition pipeline. The Soli libraries extract real-time signals from radar hardware, outputting signal transformations, high-precision position and motion data, and gesture labels and parameters at frame rates from 100 to 10,000 frames per second.
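The SDK itself was not public at the time of writing, so the snippet below is a purely hypothetical sketch of what callback-style access to those outputs could look like; every class, method, and parameter name in it is invented, including the mock sensor.

```python
# Purely hypothetical usage sketch -- the Soli SDK was not public when this
# was written, so every name below is invented, mock sensor included.
class MockSoliSensor:
    def __init__(self, frame_rate):
        assert 100 <= frame_rate <= 10_000   # frame-rate range quoted above
        self.frame_rate = frame_rate
        self.callback = None

    def on_gesture(self, callback):
        self.callback = callback

    def start(self):
        # A real sensor would stream frames; here we fake a single event.
        if self.callback:
            self.callback("virtual_dial", {"delta_degrees": 15.0})

def handle_gesture(label, params):
    if label == "virtual_dial":
        print(f"dial turned by {params['delta_degrees']:.1f} degrees")

sensor = MockSoliSensor(frame_rate=2000)
sensor.on_gesture(handle_gesture)
sensor.start()
```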


Summary


Project Soli is a new gesture-recognition technology based on radar, in contrast to established approaches that rely on visible or infrared light, such as stereo cameras, structured light, or time-of-flight sensors. This novel approach, which uses small, high-speed sensors and data-analysis techniques such as Doppler processing, can detect fine motions with sub-millimeter accuracy. Thus, for instance, Project Soli technology enables a user to issue commands to a computer by rubbing a thumb and forefinger together in pre-defined patterns. Applications might include sensors embedded in clothing, switches that don't require physical contact, and accessibility technology.


The project is headed by Ivan Poupyrev, a former scientist at Disney Imagineering who was named one of Fast Company's "100 Most Creative People in Business 2013". Project Soli was announced at Google I/O 2015 and generated considerable media interest. According to the official site, the team is preparing to release an alpha Project Soli development kit to a limited number of developers and will begin signing people up for a larger beta release later in 2015.

