Google has been experimenting with gestures and motion detection for a few years now. The Pixel 4 had a dedicated Soli radar sensor on the front that could detect certain hand gestures, which in turn triggered actions on the phone, such as changing music tracks or silencing calls and alarms. Essentially, Google was showcasing how you could perform actions on your phone without actually touching it. While Google ditched the Soli sensor on subsequent Pixel phones, it brought the sensor to the Nest Hub (2021).
The Nest Hub (2021) uses the Soli chip to track your sleep, detect sleep patterns, and recognize coughing and snoring. In addition, since the Soli sensor can detect motion, it lets you interact with the Nest Hub using certain hand gestures. At launch, the Nest Hub supported gestures to play/pause music and snooze alarms. Now, Google is working on adding another gesture to the mix.
Currently, the Nest Hub can detect two gestures: an air tap to play/pause music and a left/right swipe to dismiss timers and alarms. Now, the Google Home app has added a third gesture that will let you browse through your photos by swiping left or right in front of the screen. As reported by 9to5Google, the feature is listed in the app but is not yet live on the second-generation Nest Hub.
If you’ve enrolled your device in the preview program, you will most likely have to wait for the next preview version of the software to drop before you can add this new gesture. Such gestures make a lot of sense on devices like the Nest Hub, which sit on a bedside table or in the living room. It’s easier to navigate via gestures on a device with a large screen, and the use cases seem more practical than on a smartphone, where Google implemented them with the Pixel 4.