[14:21 Sun, 11 August 2019 by Thomas Richter]
A sensor chip developed by Google promises new ways of controlling smartphones, and computers as well, by gesture. The "Soli" project has been around for several years, but only now will it actually reach users, built into Google's high-end Pixel 4 smartphone due this autumn. Then it will become clear whether it can really change the way we interact with technology. On small screens, accurately hitting a small on-screen control can be fiddly, so gesture input has obvious appeal.
[Image: Radar gesture recognition - Google's Project Soli]

Functions such as skipping songs, switching on the alarm clock or muting incoming calls can be triggered by a simple hand gesture. These functions are only the beginning; the range of Motion Sense controls is set to grow over time. However, Motion Sense will not be available in every country when the Pixel 4 launches, presumably depending on regulatory approval of the micro radar in each market.

Virtual control elements such as rotary dials, sliders or knobs that have no physical form but can be operated by gesture are conceivable. Combined with visual feedback, an entire virtual control panel could be operated this way.

[Image: The Pixel 4 carries a whole bar of sensors for, among other things, gesture and face recognition.]

The possibilities of universal gesture control beyond the smartphone are also exciting. The very robust motion-sensing chip could be built into all kinds of devices and make physical buttons largely superfluous. It might even serve as an extremely mobile control surface for video editing and color grading: how about adjusting virtual color wheels by gesture, or navigating a timeline?

The new gesture control is demonstrated in this short clip:

Basic information about Project Soli:
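As a rough illustration of the idea, an app receiving gesture events from such a sensor could simply map gesture names to actions. This is a minimal, purely hypothetical sketch: the gesture names ("swipe_right", "swipe_down") and the dispatcher are assumptions for illustration and do not reflect Google's actual, unpublished Motion Sense API.

```python
# Hypothetical sketch of gesture-to-action dispatch. None of these names
# come from Google's Soli / Motion Sense API; they are illustrative only.

def make_dispatcher(bindings):
    """Return a function that maps a gesture name to its bound action."""
    def dispatch(gesture):
        action = bindings.get(gesture)
        return action() if action else None
    return dispatch

# Example bindings for the use cases named in the article.
playlist = ["Song A", "Song B", "Song C"]
state = {"track": 0, "muted": False}

def skip_song():
    # Advance to the next track, wrapping around at the end.
    state["track"] = (state["track"] + 1) % len(playlist)
    return playlist[state["track"]]

def mute_call():
    # Silence an incoming call.
    state["muted"] = True
    return "muted"

dispatch = make_dispatcher({
    "swipe_right": skip_song,
    "swipe_down": mute_call,
})
```

With these bindings, `dispatch("swipe_right")` advances the playlist and `dispatch("swipe_down")` mutes the call; unknown gestures are simply ignored.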