Humans give specific signals when they interact, and Google wants its devices to recognize them. Several Google products can already read body language and are "aware of their own space": the Nest Hub picks up gestures, users can snooze Google's smart alarms with just a wave of the hand, the Pixel has motion sensing, and even the Nest thermostat can tell when a person is standing right in front of it.

All these devices that simulate "awareness" are powered by an old technology: radar. Google's engineers have managed to update, miniaturize, and digitize radar technology in a chip they call Soli. Like any radar, Soli emits electromagnetic waves and interprets how those waves are reflected back in three dimensions. Using deep learning, it can recognize the things it has been trained to recognize, such as a hand in motion.
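In spirit, that last step is a classification problem: reduce each radar frame to a feature vector and match it against known gesture patterns. The sketch below is purely illustrative and is not Google's actual Soli pipeline; the gesture names, the three-value feature layout, and the nearest-template matching are all invented assumptions to show the idea.

```python
# Hypothetical sketch of radar gesture recognition (NOT Google's real Soli
# code). Each incoming radar frame is assumed to be pre-processed into a
# tiny feature vector, here summarizing e.g. range, velocity, and energy.

# Invented reference patterns for a few gestures.
TEMPLATES = {
    "swipe":      (0.9, 0.1, 0.5),
    "tap":        (0.2, 0.8, 0.3),
    "no_gesture": (0.0, 0.0, 0.0),
}

def classify(frame):
    """Return the gesture whose template is closest to the frame.

    A real system would use a trained deep network instead of this
    nearest-template lookup; the matching step is the same in spirit.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TEMPLATES, key=lambda g: sq_dist(frame, TEMPLATES[g]))

print(classify((0.85, 0.15, 0.45)))  # a swipe-like frame → "swipe"
```

The deep-learning version replaces the hand-written templates with patterns learned from many recorded examples, which is what lets Soli generalize across different hands and motions.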


The Google Advanced Technology & Projects (ATAP) team is pushing Soli radar technology to its limits. The goal is for devices to identify users' intentions and level of engagement, and to treat gestures as commands: a simple turn toward a device, or a glance at it, would trigger it into action. By studying basic human body language, the team has already programmed a wide range of cues and interactions. Devices "should feel like a best friend," Lauren Bedal, a senior interaction designer at ATAP, told Engadget.

The Problem With Engagement And Digital Rewards

Soli Chip

The Google ATAP team says it is programming devices to be non-intrusive. Soli is now aware of nonverbal cues such as a person's proximity and body language, and even biosignals like heartbeat and respiration. From micro-gestures made with the fingers to full hand movements, or identifying whether a user is just passing by, the team has built a "library of movements" that gives Soli this awareness.
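The presence cues the article describes can be thought of as inputs to a simple decision rule: how close someone is, whether they are facing the device, and whether they are just walking past. The rules, thresholds, and reaction names below are hypothetical, a minimal sketch of the idea rather than anything Google has published.

```python
# Illustrative sketch (not Google's API): mapping coarse presence cues,
# of the kind a Soli-style sensor could supply, to device reactions.
# All thresholds and reaction names here are invented for illustration.

def react(proximity_m: float, facing: bool, moving: bool) -> str:
    """Pick a hypothetical device reaction from coarse presence cues."""
    if proximity_m > 2.0:
        return "idle"             # nobody close enough to engage
    if moving and not facing:
        return "idle"             # user is just passing by
    if facing and proximity_m < 1.0:
        return "show_details"     # engaged and up close
    return "show_glanceable"      # nearby but not fully engaged

print(react(0.5, True, False))    # engaged up close → "show_details"
```

A real system would smooth these decisions over time so the screen does not flicker between states as a person moves, which is exactly the kind of non-intrusiveness ATAP says it is aiming for.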

It is well established that technology is designed to capture people's attention. Daily smartphone notifications and the hours spent scrolling through endless social stories of little to no value are just the tip of the iceberg. The tech industry has been criticized for using "rewards" to keep users plugged in. Strangely enough, deep learning, the technique that powers Google's small Soli chip, is itself trained with "digital rewards" of a sort: numerical signals that push it toward its goals.

Several questions inevitably arise with this type of technology. For example, a video may pause when a user leaves the room and resume when they come back. But would the user have kept watching if the video had not resumed on its own? Doesn't that automatic resume grab the user's attention and influence their behavior? Can Google devices really know what people want, or will they simply give people what the devices think they want and take it from there? Finally, while the technology seems to have endless uses, it can be annoying when a phone lights up just because the user moved it.