Google built a tiny radar system into a smartwatch for gesture control.

"How are you going to interact with an invisible computer?"

When you hear a question like that posed in a conference room at a major tech corporation like Google, you expect you're in for an hour or two of techno-philosophizing with few tangible results at the end of it.

But then somebody sets a smartwatch on the table in front of you. You snap your fingers in the air just a couple of inches away from it. And the digital watch face starts spinning.

Ivan Poupyrev, who posed that question (and many more), works at Google's ATAP research lab and is the technical project lead for Project Soli, an effort to prove that we can embed tiny radar chips into electronics and use minute hand gestures to control the digital world around us. Why on earth would you want radar in a smartwatch?

To prove that you can interact with an invisible computer. Duh.


ATAP, which stands for Advanced Technologies and Projects, is a division within Google that's at a crossroads. It was formerly led by Regina Dugan of DARPA fame, and her influence led the division to pursue technologies ranging from modular phones (Project Ara) to real-time 3D mapping (Tango) to cinematic, live-action virtual reality movies (Spotlight Stories). Dugan left for Facebook earlier this year, however, so it was an open question whether the projects she left behind would continue. Tango has "graduated" into Google, while Ara seems mired in the muck.

But the Jacquard touch-sensitive fabric project and Soli are still at ATAP, and Soli, at least, has a new and singular goal: create both the industry and the design language for radar-enabled consumer electronics. That's why Poupyrev directed his team to do more than just experiment: to prove that radar can work in a smartwatch.

"IF YOU CAN PUT SOMETHING IN A SMARTWATCH, YOU CAN PUT IT ANYWHERE"

"If you can put something in a smartwatch, you can put it anywhere," Poupyrev says. So ATAP redesigned the Soli chip to make it smaller and draw less power. And then it redesigned it to do the same thing again. And again. Finally, according to Hakim Raja, Soli's lead hardware and production engineer, the team created the tiniest of the chips you see above. It's a tiny sliver you could balance on your pinky toenail, with four antennas that provide full duplex communication for sending and receiving radar pings. The first iteration of Soli, which shipped to in a development kit, drew 1.2w of power. This one draws 0.054w, a 22x reduction.

But making a chip that tiny has drawbacks. Radar was designed to detect massive flying metal objects from miles away, not millimeter-scale movements of your fingers inches away. Until very recently, nobody had bothered worrying about power draw at this scale, and nobody had had to figure out what the signal would even look like shrunk down this small.

Jaime Lien is the lead research engineer for Soli, and it's her job to tune the machine learning algorithms that ultimately get hardwired into the chip. Her first realization was that it made sense to convert the spatial signal radar provides into a temporal one that makes more sense to a computer. But that was nothing compared to the noise problems you run into at these tiny scales. She showed me the "glitch zoo," a huge set of screenshots of every kind of impenetrable noise her algorithms have to find signal in. At these scales, it's impossible to do any sort of beamforming, and even the electrons running through the chip have to be accounted for.

It's complicated, in other words.
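Still, the core spatial-to-temporal move Lien describes can be sketched generically: each radar chirp yields a range profile (the spatial view), and tracking one slice of that profile across successive chirps yields a time series a classifier can work with. A minimal illustration, with all shapes and parameters invented rather than taken from Soli's actual pipeline:

```python
import numpy as np

def range_profiles(raw_chirps: np.ndarray) -> np.ndarray:
    """FFT each chirp: fast-time samples -> range bins (the spatial view)."""
    return np.fft.rfft(raw_chirps, axis=-1)

def temporal_signal(raw_chirps: np.ndarray, target_bin: int) -> np.ndarray:
    """Track one range bin across chirps: a time series (the temporal view)."""
    profiles = range_profiles(raw_chirps)
    return np.abs(profiles[:, target_bin])

# 64 chirps of 256 samples each; synthetic noise stands in for real hardware
chirps = np.random.randn(64, 256)
print(temporal_signal(chirps, target_bin=10).shape)  # (64,): one value per chirp
```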


Compared to the electronics and the machine learning algorithms, actually deciding which gestures should do what seems easy. But if you think about it for a minute, maybe it's not. With a touchscreen, you can see buttons and sliders. With a physical switch, you can feel the snick when you flick it on. But if there's nothing but air, how do you guide the user?

IF THERE'S NOTHING BUT AIR, HOW DO YOU GUIDE THE USER?

"Is everything going to have its own interface?" Poupyrev asks. "Is every switch, every smart sprinkler, or cup going to have its own? It's going to create confusion." One of Soli's goals is to create a common design language that's easy to learn but flexible enough to control a lot of things.

Nick Gillian, lead machine learning engineer for Soli, walked me through the basic gestures the team has settled on. There are essentially two zones: near and far. From far away, you don't do much (though you could wave your arm around, Kinect-like). But when you get close, Soli can detect finer and finer movements. So the first gesture is simple: proximity. As you move your hand closer to the watch, it lights up, showing you information and letting you know your hand is in the zone of real interaction.
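A toy sketch of that two-zone behavior; the 10cm threshold and the Display interface here are invented for illustration, not taken from Soli:

```python
NEAR_THRESHOLD_M = 0.10  # assumed: within ~10 cm counts as "near"

class Display:
    def wake(self):  print("display: light up, show info, track fine gestures")
    def sleep(self): print("display: idle")

def on_range_update(estimated_range_m: float, display: Display) -> None:
    if estimated_range_m < NEAR_THRESHOLD_M:
        display.wake()   # hand is in the zone of real interaction
    else:
        display.sleep()  # far away: coarse, Kinect-like waves at most

on_range_update(0.05, Display())  # hand 5 cm away -> display wakes
```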

YOU CAN PHYSICALLY FEEL YOUR OWN FINGERS

There, Poupyrev says, Soli is "basing this language on the metaphors which are already established in the world. We're borrowing this language from physical controls." Those controls are the dial (rubbing your finger and thumb together as though you're twisting a toothpick), the button (tapping your thumb and finger together), and the slider (sliding your thumb along your finger).
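For illustration only (these gesture names and event shapes are invented, not Soli's API), the mapping from recognized gestures to their borrowed physical controls might look like:

```python
from enum import Enum, auto

class Gesture(Enum):
    FINGER_RUB = auto()   # thumb rubbing finger, like twisting a toothpick
    FINGER_TAP = auto()   # thumb tapping finger
    THUMB_SLIDE = auto()  # thumb sliding along finger

def to_control_event(gesture: Gesture, magnitude: float = 0.0) -> dict:
    """Translate a recognized gesture into a virtual dial/button/slider event."""
    if gesture is Gesture.FINGER_RUB:
        return {"control": "dial", "delta": magnitude}
    if gesture is Gesture.FINGER_TAP:
        return {"control": "button", "pressed": True}
    return {"control": "slider", "position": magnitude}

print(to_control_event(Gesture.FINGER_RUB, magnitude=0.25))
# {'control': 'dial', 'delta': 0.25}
```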

The nice thing about all of these gestures is that they give the interaction two levels of feedback: you can see the screen responding to your gestures, sure, but you can also physically feel your own fingers. It sounds silly, but the touch of your fingertips against each other is literally tangible.


But why even bother? "On a really simple level, it means you get to use your entire screen for what screens are meant for: showing you stuff," says Poupyrev. If you can control your devices by wriggling your fingers in the air above them, you don't need to litter the screen with buttons. "You're pretty much wasting precious screen real estate which could be used for something useful," says Poupyrev. That matters on a screen as small as a watch's. The team could just "design for the eye ... rather than for the finger."

"DESIGN FOR THE EYE ... RATHER THAN FOR THE FINGER."

That's interesting for a touchscreen, but Poupyrev's vision is something larger: as computers get smaller, they're eventually going to be everywhere. When that happens, we'll need a way to interact with them. There are a lot of bets on using voice for this, but why not use our hands too?

As humans, we evolved to use our hands as the primary way we interact with the real world. It's only as we build technology that we have to stack on sensors and buttons and touchscreens, because we can't interact with digital content directly.

Soli and Jacquard are ATAP's projects to help everybody get ready for a future where everything is a computer. That future could be a hellscape of screens and beeps, or it could be more naturalistic: one where you can touch your sleeve to start your music, speak out loud to search the internet, and snap your fingers to start a program.

"We're coming back to this humanity, where the digital becomes part of the real," Poupyrev says. But that's a heady dream, and usually you wake up from a dream and get back to the real world. To keep his dream from fading away and being forgotten like most dreams, Poupyrev has to do something now.

He and his team have to make that watch.


The first Soli prototypes are an LG Watch Urbane and a JBL speaker, and neither is anywhere close to being a consumer product. Speakers vibrate to make sound, so there are hard problems to solve when you're trying to add a chip that detects millimeter-scale movements. The watch still has power and interaction issues to suss out.

But ATAP isn't building these just to prove they're possible in theory. It's doing it in partnership with LG, Qualcomm, JBL, and others to prove to those companies that radar chips can go into real, shipping consumer products.

When Dugan was running ATAP, her mantra was that each team had to produce a "demonstration at convincing scale" within two years. Every one of those words has a driving force behind it, and even though Soli is now two years in, it's still going. Poupyrev believes Soli has already achieved that demonstration: it's gotten the chips made, created software that could work on those (or any) chips, and created a design language that could extend to lots of devices.

ATAP IS SERIOUS ABOUT PUTTING THIS TECHNOLOGY ON STORE SHELVES

Now the goal is to work with consumer product companies to put that whole stack into a shipping product. It's not quite the way ATAP worked before, but Poupyrev says "we are serious" about the goal of putting this technology on store shelves. That's an entirely different scale than anything ATAP has done, and to convince us it's possible, the Soli team will need to show it can make as much progress in the next two years as it has in the last two.
