Google’s upcoming Pixel device might feature a chip not found in any other smartphone today. The chip could allow the new smartphone to do away with buttons, and might even let users operate it without touching the display.

Back in 2015, Google unveiled a chip that uses radar technology to detect movements and gestures made by the human hand. The chip, developed under Project Soli, detects even the slightest finger movements and gestures, allowing it to be used in a variety of functions and devices. That was four years ago. Now, 9To5Google claims to have heard that it is coming to the Google Pixel 4.

A report from XDA Developers seems to corroborate 9To5Google’s claim. XDA said it found an in-development feature in the Android Q beta. The feature comes in the form of two new gestures -- “Skip” and “Silence” -- that require a new “aware” sensor not present or defined in any Pixel device currently on the market.

XDA said the two new gestures, which sound like they’re meant for media playback control, aren’t complete yet and thus cannot be tested. The new gestures, along with the new sensor, could very well be included in the upcoming Pixel 4 device. What’s more, the “aware” sensor indicated could very well be the Soli chip, which is known to detect gestures.

Google Pixel 3. Photo: TIMOTHY A. CLARY/Contributor/Getty Images

What makes it so exciting?

The prospect of the Project Soli chip inside the upcoming Google Pixel 4 should excite both technology enthusiasts and rival smartphone makers. Why? Because it would allow people to use their smartphones in ways they have never been used before, BGR noted.

A recently approved Apple patent, for example, showed that the Cupertino tech giant is working on technology that would allow future iPads to detect hover gestures. This technology, if released, would let users operate their Apple device without touching the screen. It might, however, require an Apple Pencil.

Project Soli, on the other hand, takes that a few notches further, letting people do a lot of things on their devices using various hand and finger movements. Tapping the thumb with the index finger, for example, registers as pressing a virtual button. Sliding the thumb along the index finger, on the other hand, might adjust a volume slider.

Watch the video below to get an idea of how Project Soli might work.