Radar-Based Gesture Recognition for Wearable Electronics
From Google's Advanced Technology and Projects (ATAP) group, the Project Soli team is developing a new gesture interaction sensor using radar technology. The sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale, and can be built into small devices and everyday objects. The interaction sensor runs at 60 GHz and can capture motions of a user's fingers at resolutions and speeds that haven't been possible before, at up to 10,000 frames per second. To get there, the team had to reinterpret traditional radar, which bounces a signal off an object and provides a single return ping. The team's first prototype is a 5 x 5 mm piece of silicon.
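The sub-millimeter claim is worth unpacking: at 60 GHz, sweep bandwidth sets only a centimeter-scale range resolution, while phase changes in the return make micrometer-scale motion detectable. A back-of-the-envelope sketch (the 7 GHz bandwidth is an assumption for illustration, not a published Soli spec):

```python
import math

# Illustrative radar sensing math; the bandwidth below is an assumption.
C = 3e8                # speed of light, m/s
CARRIER_HZ = 60e9      # 60 GHz carrier, as stated above
BANDWIDTH_HZ = 7e9     # assumed sweep bandwidth (e.g. a 57-64 GHz band)

# Coarse range resolution from bandwidth: c / (2B) -- about 2.1 cm here.
range_resolution = C / (2 * BANDWIDTH_HZ)

# Wavelength at 60 GHz is 5 mm; a phase shift of delta_phi radians on the
# return corresponds to a radial displacement of wavelength*delta_phi/(4*pi).
wavelength = C / CARRIER_HZ
displacement_per_degree = wavelength * math.radians(1) / (4 * math.pi)

print(f"{range_resolution * 100:.1f} cm range resolution")
print(f"{displacement_per_degree * 1e6:.1f} um motion per degree of phase")
```

So even though the radar cannot separate two objects closer than a couple of centimeters in range, a one-degree phase shift already corresponds to roughly 7 micrometers of radial motion, which is what makes micro-gestures visible.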
Transcript
00:00:01 Poupyrev: My name is Ivan Poupyrev, and I work for the Advanced Technology and Projects group at Google. The hand is the ultimate input device. It's extremely precise, it's extremely fast, and it's very natural for us to use it. Capturing the possibilities of the human hand was one of my passions. How could we take this incredible capability, the finesse of human actions and the finesse of using our hand,
00:00:25 but apply it to the virtual world? We use the radio frequency spectrum, which is radar, to track the human hand. Radars have been used for many different things: to track cars, big objects, satellites, and planes. We're using them to track micro-motions, twitches, of the human hand and then use that to interact with wearables, the Internet of Things, and other computing devices.
00:00:54 Lien: Our team is focused on taking radar hardware and turning it into a gesture sensor. Radar is a technology which transmits a radio wave towards a target, and then the receiver of the radar intercepts the reflected energy from that target. The reason why we're able to interpret so much from this one radar signal is because of the full gesture recognition pipeline that we've built.
00:01:19 The various stages of this pipeline are designed to extract specific gesture information from this one radar signal that we receive at a high frame rate. Amihood: From these strange, foreign range-Doppler signals, we are actually interpreting human intent. Karagozler: Radar has some unique properties when compared to cameras, for example. It has very high positional accuracy, which means that you can sense the tiniest motions.
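The range-Doppler signals mentioned here are conventionally produced by a two-stage FFT: one across the samples within each chirp for range, and one across chirps for Doppler (radial velocity). A minimal sketch with simulated data, assuming a de-chirped FMCW-style signal; the parameters and signal model are illustrative, not Soli's actual pipeline:

```python
import numpy as np

N_FAST, N_SLOW = 64, 32          # samples per chirp, chirps per frame
range_bin, doppler_bin = 10, 5   # where the simulated target should appear

n = np.arange(N_FAST)
m = np.arange(N_SLOW)
# Simulated de-chirped beat signal for a single point target: one complex
# tone in fast time (range) and one in slow time (Doppler).
frame = np.exp(2j * np.pi * (range_bin * n[None, :] / N_FAST +
                             doppler_bin * m[:, None] / N_SLOW))

# Range FFT along fast time, then Doppler FFT along slow time.
range_profile = np.fft.fft(frame, axis=1)
rd_map = np.fft.fft(range_profile, axis=0)

# The magnitude peak of the range-Doppler map recovers the target's bins.
peak = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
print(peak)  # -> (5, 10), i.e. (doppler_bin, range_bin)
```

A gesture pipeline like the one described would then extract features from a stream of such maps (and from the raw phase) rather than from camera pixels.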
00:01:51 Schwesig: We arrived at this idea of virtual tools because we recognized that there are certain archetypes of controls, like a volume knob or a physical slider, a volume slider. Imagine a button between your thumb and your index finger, and the button's not there, but pressing it is a very clear action. And there's an actual physical haptic feedback that occurs as you perform that action.
00:02:15 The hand can both embody a virtual tool, and it can also be, you know, acting on that virtual tool at the same time. So if we can recognize that action, we have an interesting direction for interacting with technology. Poupyrev: So when we started this project, you know, me and my team, we looked at the project idea, and we thought, "Are we gonna make it or not? Eh, we don't know." But we have to do it.
00:02:40 Because unless you do it, you don't know. Raja: What I think I'm most proud of about our project is, we have pushed the processing power of the electronics itself further out to do the sensing part for us. Poupyrev: The radar has a property which no other technology has. It can work through materials. You can embed it into objects. It allows us to track really precise motions.
00:03:04 And what is most exciting about it is that you can shrink the entire radar and put it in a tiny chip. That's what makes this approach so promising. It's extremely reliable. There's nothing to break. There's no moving parts. There's no lenses. There's nothing, just a piece of sand on your board.
00:03:23 Schwesig: Now we are at a point where we have the hardware where we can sense these interactions, and we can put them to work. We can explore how well they work and how well they might work in products. Poupyrev: It blows your mind, usually, when you see things people do. And that I'm really looking forward to. I'm really looking forward to releasing this to the development community,
00:03:47 and I really want them to be excited and motivated to do something cool with it, right?

