Now that the smartphone has become our standard day-to-day technological device, man-machine interaction is moving into a new phase. US-based researchers have been testing a new type of sensor that uses the reflection of the phone's wireless signals to sense and interpret nearby human gestures.

Gesture Control: Reflected Smartphone Signals Could Replace Camera Tracking
Over two thirds of the US population owns a smartphone. Meanwhile, gesture control of electronic devices has become second nature through the use of touchscreens. Now Matt Reynolds, Assistant Professor in the Department of Electrical and Computer Engineering at Duke University, and Shwetak Patel, Associate Professor in the departments of Computer Science and Engineering and Electrical Engineering at the University of Washington, are, like many others, working on new ways for users to interact with their devices. They have developed an approach they call SideSwipe, which grew out of their research on converting radio signals into binary code so as to detect and interpret movement. As part of this work, they have built a new sensor that uses the reflection of the phone's own wireless transmissions to capture movement close by, meaning you could interact with your smartphone without actually holding it, for instance when it is in your pocket or handbag. When your hand moves through the space near the phone, your body reflects some of the transmitted signal back toward the phone. By classifying the changes in the reflected signal, the UW researchers have shown, it is possible to identify the type of gesture being performed.

Saving precious battery power

These days smartphones already come equipped with a number of sophisticated sensors: a camera, plus an accelerometer and gyroscope that track the phone’s movement. Manufacturers are also starting to build camera-based 3-D gesture sensing into smartphones. However, cameras consume significant battery power and, in order to capture gestures precisely, need a clear view of the user’s hands. The low-power wireless sensing technology behind the Reynolds-Patel SideSwipe represents a much smaller drain on the battery than the camera’s video sensor. In addition to having a minimal impact on battery life, the researchers say, their invention is simple to use and requires no special techniques on the user’s part. When you make a call, or an app on your phone exchanges data with the Internet, the phone transmits radio signals over a 2G, 3G or 4G cellular network to communicate with a base station. When your hand moves through the space near the phone, your body reflects part of that transmitted signal back towards the handset. The new UW system uses multiple small antennae to capture the changes in the reflected signal and classifies those changes to detect the type of gesture performed. Based on these variations, your phone can learn over time what your various gestures mean: swiping from the right or the left, for example, could correspond to distinct commands, such as unlocking the screen or dialling a stored number.
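To make the idea concrete, the classification step can be sketched in a few lines of code. This is a minimal illustration, not the UW team's actual implementation: it assumes each gesture produces a feature vector of amplitude changes measured at four antennas (the gesture names and numbers below are invented for the example), and it uses a simple nearest-centroid classifier trained on a few labelled examples per gesture.

```python
import math

# Hypothetical training data: for each gesture, a few feature vectors, where
# each vector holds the change in reflected signal amplitude at four antennas.
TRAINING = {
    "swipe_left":  [[0.8, 0.6, 0.2, 0.1], [0.7, 0.5, 0.3, 0.1]],
    "swipe_right": [[0.1, 0.2, 0.6, 0.8], [0.2, 0.3, 0.5, 0.7]],
    "tap":         [[0.4, 0.4, 0.4, 0.4], [0.5, 0.4, 0.5, 0.4]],
}

def centroid(vectors):
    """Mean feature vector for one gesture class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the gesture whose centroid is nearest (Euclidean) to the sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda g: dist(sample, centroids[g]))

centroids = {g: centroid(vs) for g, vs in TRAINING.items()}
print(classify([0.75, 0.55, 0.25, 0.1], centroids))  # → swipe_left
```

A production system would of course use richer features and a more robust learner, but the principle is the same: record labelled reflection patterns, then match new measurements against them.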

Going beyond the touchscreen

A smartphone’s wireless transmissions have the great advantage of passing easily through the fabric of clothing or a handbag. In addition, the phone can pick up changes in the radio waves over some distance, so the space around the device could soon become new terrain for smartphone-based man-machine interaction. Indeed, once sensors can be programmed to reliably capture gestures at a distance from the device by analysing changes in the reflected signal, the already traditional touch-based approach may start to seem rather out of date. So far, a group of ten study participants has tested the technology by performing 14 different hand gestures, including hovering, sliding and tapping, in various positions around a smartphone. During the tests, the phone was calibrated each time by learning the user's hand movements, gradually training itself to respond ‘correctly’ and eventually recognising gestures with around 87% accuracy. Nevertheless, if SideSwipe is to be used reliably, its developers will need to overcome some basic restrictions inherent in the system. While a smartphone will ‘learn’ over time to recognise the range of movements characteristic of its owner, how will it react to other movements, by domestic pets in the vicinity, for example? Meanwhile, other smartphones with the same technology installed may well react in a very different way to gesture commands. Still, the attractions of SideSwipe are obvious, and the University of Washington team reckon that you will soon be able to use the system to silence a ringtone or change the music playing on your device.
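The accuracy figure reported above comes from comparing the phone's predictions against the gestures the participants actually performed. A minimal sketch of that evaluation, using invented labels rather than the study's data, looks like this:

```python
# Hypothetical labelled results from one calibration session:
# (gesture performed, gesture recognised) pairs. The data is illustrative only.
results = [
    ("hover", "hover"), ("slide", "slide"), ("tap", "tap"),
    ("hover", "slide"), ("tap", "tap"), ("slide", "slide"),
    ("hover", "hover"), ("tap", "hover"),
]

correct = sum(1 for truth, predicted in results if truth == predicted)
accuracy = correct / len(results)
print(f"{accuracy:.0%}")  # → 75%
```

Run over all ten participants and all 14 gestures, the same tally is what yields an overall figure such as the reported 87%.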
By Simon Guigue