With a new app developed by Swiss researchers, users can now control their smartphones with gestures resembling sign language.
"Many movement-recognition programmes need plenty of processor and memory power. Our new algorithm uses a far smaller portion of computer memory and is ideal for smartphones," explained its developer Otmar Hilliges, professor of computer science from the Swiss Federal Institute of Technology in Zurich (ETHZ).
The app's minimal processing footprint means it could also run on smartwatches or in augmented-reality devices like Google Glass.
The app currently recognises six different gestures and executes their corresponding commands.
The program uses the smartphone's built-in camera to observe its surroundings.
It evaluates neither depth nor colour. The information it does register, such as the shape of the gesture and the visible parts of the hand, is reduced to a simple outline that is matched against stored gestures.
The program then executes the command associated with the gesture it recognises.
It also recognises the hand's distance from the camera and warns the user when the hand is either too close or too far away.
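The pipeline described above can be sketched in a few lines. This is a hypothetical illustration, not the ETHZ team's actual code: the templates, names (`GESTURES`, `classify`, `check_distance`, `handle_frame`) and the use of a coarse binary silhouette with Hamming-distance matching are all assumptions made for the sake of the example. The fill-ratio distance check stands in for whatever heuristic the real app uses to warn when the hand is too close or too far.

```python
# Hypothetical sketch of outline-based gesture classification.
# Each stored gesture is a coarse binary silhouette on a 4x4 grid,
# flattened to a tuple of 0/1 cells; a real system would learn these.
GESTURES = {
    "open_palm": (0,1,1,0,
                  1,1,1,1,
                  1,1,1,1,
                  0,1,1,0),
    "point":     (0,1,0,0,
                  0,1,0,0,
                  0,1,1,0,
                  0,1,1,0),
}

# Illustrative mapping from recognised gesture to phone command.
COMMANDS = {"open_palm": "unlock", "point": "select"}

def classify(outline):
    """Match an observed outline to the nearest stored template
    by Hamming distance (number of mismatching cells)."""
    best, best_dist = None, None
    for name, template in GESTURES.items():
        dist = sum(a != b for a, b in zip(outline, template))
        if best_dist is None or dist < best_dist:
            best, best_dist = name, dist
    return best

def check_distance(outline, near=14, far=3):
    """Crude proxy for hand distance: a silhouette that fills too much
    of the frame means the hand is too close, too little means too far."""
    filled = sum(outline)
    if filled >= near:
        return "too close"
    if filled <= far:
        return "too far"
    return "ok"

def handle_frame(outline):
    """Warn on bad distance, otherwise run the matched gesture's command."""
    warning = check_distance(outline)
    if warning != "ok":
        return warning
    return COMMANDS[classify(outline)]
```

For example, `handle_frame(GESTURES["point"])` returns `"select"`, while an all-ones silhouette triggers the `"too close"` warning. Matching flattened binary outlines instead of depth or colour images is what keeps the memory and processing cost low enough for a phone.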
Gesture control will not replace touchscreen control but supplement it, Hilliges noted.
"To expand its functionality, we are going to add further classification schemes to the programme", researchers added.
The team recently presented the app to an audience of industry professionals at a symposium in Honolulu, Hawaii.