Google’s New Tech Can Read Your Body Language—Without Cameras

The company’s ATAP research team is using radar to help computers respond to your movements, like turning off a TV if…

What if your computer decided not to blare out a notification jingle because it noticed you weren’t sitting at your desk? What if your TV saw you leave the couch to answer the front door and paused Netflix automatically, then resumed playback when you sat back down? What if our computers took more social cues from our movements and learned to be more considerate companions?

It sounds futuristic and perhaps more than a little invasive—a computer watching your every move? But it feels less creepy once you learn that these technologies don’t have to rely on a camera to see where you are and what you’re doing. Instead, they use radar. Google’s Advanced Technology and Projects division—better known as ATAP, the department behind oddball projects such as a touch-sensitive denim jacket—has spent the past year exploring how computers can use radar to understand our needs or intentions and then react to us appropriately.

This is not the first time we’ve seen Google use radar to give its gadgets spatial awareness. In 2015, Google unveiled Soli, a sensor that uses radar’s electromagnetic waves to pick up precise gestures and movements. It first appeared in the Google Pixel 4, which used it to detect simple hand gestures so the user could snooze alarms or pause music without physically touching the smartphone. More recently, radar sensors were embedded inside the second-generation Nest Hub smart display to detect the movement and breathing patterns of the person sleeping next to it. The device was then able to track the person’s sleep without requiring them to strap on a smartwatch.

The same Soli sensor is being used in this new round of research, but instead of using the sensor input to directly control a computer, ATAP is using the sensor data to enable computers to recognize our everyday movements and make new kinds of choices.

“We believe as technology becomes more present in our life, it’s fair to start asking technology itself to take a few more cues from us,” says Leonardo Giusti, head of design at ATAP. In the same way your mom might remind you to grab an umbrella before you head out the door, perhaps your thermostat can relay the same message as you walk past and glance at it—or your TV can lower the volume if it detects you’ve fallen asleep on the couch.

Radar Research

[Image: Google ATAP demo showing a human entering a computer’s personal space. Courtesy of Google]

Giusti says much of the research is based on proxemics, the study of how people use space around them to mediate social interactions. As you get closer to another person, you expect increased engagement and intimacy. The ATAP team used this and other social cues to establish that people and devices have their own concepts of personal space. 

Radar can detect you moving closer to a computer and entering its personal space. This might mean the computer can then choose to perform certain actions, like booting up the screen without requiring you to press a button. This kind of interaction already exists in current Google Nest smart displays, though instead of radar, Google employs ultrasonic sound waves to measure a person’s distance from the device. When a Nest Hub notices you’re moving closer, it highlights current reminders, calendar events, or other important notifications. 
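At its simplest, this kind of proximity-triggered behavior reduces to a threshold on the sensor’s distance estimate. Here is a minimal sketch of the idea—not Google’s actual implementation; the sensor readings, threshold values, and function names below are all hypothetical—showing how a device might map distance readings onto a wake/sleep decision, with a second, farther threshold (hysteresis) so the screen doesn’t flicker when someone hovers near the boundary:

```python
# A minimal sketch (not Google's actual code) of mapping a radar or
# ultrasonic distance reading onto a "personal space" wake-up behavior.
# The thresholds and sensor interface here are hypothetical.

WAKE_DISTANCE_M = 1.2    # entering the device's "personal space" wakes it
SLEEP_DISTANCE_M = 1.8   # farther boundary for leaving, to avoid flicker

def next_screen_state(distance_m: float, screen_on: bool) -> bool:
    """Return whether the screen should be on, given the latest reading."""
    if not screen_on and distance_m <= WAKE_DISTANCE_M:
        return True   # person approached: wake the display
    if screen_on and distance_m >= SLEEP_DISTANCE_M:
        return False  # person walked away: dim the display
    return screen_on  # between thresholds: keep the current state

if __name__ == "__main__":
    # Simulated stream of readings as someone approaches, then leaves.
    readings = [3.0, 2.2, 1.5, 1.1, 0.9, 1.4, 2.0, 2.5]
    on = False
    for d in readings:
        on = next_screen_state(d, on)
        print(f"distance={d:.1f} m -> screen {'ON' if on else 'off'}")
```

The two-threshold design matters in practice: with a single cutoff, small fluctuations in the distance estimate would toggle the screen on and off repeatedly as a person lingers at the edge of the device’s personal space.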
