Gesture-based control for smartphones

When it comes to consumers, the smartphone is the quintessential must-have device. Everyone has one, and the number of users grows with each passing day. Their prevalence means the professional AV industry has had to sit up, take note and start figuring out how to integrate smartphones into projects and installations.

Smartphones are increasingly being used as the user interface for control applications, especially in the home automation market. Savant, for example, offers its control applications through Apple smart devices. Smartphones are also being integrated into collaboration products to give end users the comfort of BYOD. With this in mind, it is worth taking a look at how users interact with their smartphones.

That is exactly what a team of researchers at the University of Washington is doing. Rather than being limited to capacitive touch input, they want to bring gesture control to smartphones. Gesture control is already possible with camera tracking, but that approach drains the device’s battery and requires a clear view of the user’s hand.

The engineers at the University of Washington have developed a new form of low-power wireless sensing technology that could soon contribute to this growing field by letting users “train” their smartphones to recognize and respond to specific hand gestures made near the phone.

Developed in the labs of Matt Reynolds and Shwetak Patel, UW associate professors of electrical engineering and of computer science and engineering, the technology uses the phone’s own wireless transmissions to sense nearby gestures. It therefore works even when the device is out of sight.

Reynolds speaks about the development: “Today’s smartphones have many different sensors built in, ranging from cameras to accelerometers and gyroscopes that can track the motion of the phone itself. We have developed a new type of sensor that uses the reflection of the phone’s own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen.”
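The article does not describe how the recognition itself works, but the idea of “training” a phone on labeled examples and then matching new input against them can be sketched in a few lines. The Python below is purely illustrative and hypothetical: it assumes each gesture arrives as a list of amplitude samples from the reflected transmission, and it classifies a new trace against stored training examples with a simple nearest-neighbor match. None of the names or the pipeline come from the UW system.

    import math
    import random

    def to_features(trace, length=32):
        # Resample the raw amplitude trace to a fixed length, then
        # z-normalize it so traces of different durations and signal
        # strengths are comparable.
        step = (len(trace) - 1) / (length - 1)
        resampled = [trace[round(i * step)] for i in range(length)]
        mean = sum(resampled) / length
        std = math.sqrt(sum((x - mean) ** 2 for x in resampled) / length) or 1.0
        return [(x - mean) / std for x in resampled]

    class GestureRecognizer:
        def __init__(self):
            self.templates = []  # (label, feature vector) pairs

        def train(self, label, trace):
            # Store one labeled example captured during a training session.
            self.templates.append((label, to_features(trace)))

        def classify(self, trace):
            # Return the label of the closest stored template (1-nearest neighbor).
            query = to_features(trace)
            def distance(template):
                return sum((a - b) ** 2 for a, b in zip(query, template[1]))
            return min(self.templates, key=distance)[0]

    # Toy usage, with synthetic traces standing in for real reflected-signal data.
    def fake_trace(shape, n=100):
        return [shape(i / n) + random.gauss(0, 0.05) for i in range(n)]

    recognizer = GestureRecognizer()
    for _ in range(5):
        recognizer.train("swipe", fake_trace(lambda t: math.sin(2 * math.pi * t)))
        recognizer.train("hover", fake_trace(lambda t: 1.0))

    print(recognizer.classify(fake_trace(lambda t: math.sin(2 * math.pi * t))))

A real system would use more robust signal capture and classification, but the shape of the interaction is the same: a short training pass per gesture, then low-cost matching at runtime.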

The team will present the technology, titled SideSwipe, and a related paper on 8th October 2014 at the Association for Computing Machinery’s Symposium on User Interface Software and Technology in Honolulu.