Gesture recognition is a non-contact user interface technique for interacting with application programs. It captures body movements and postures and converts them into usable information: the movement, position, direction and speed of a hand, the head or the whole body.
Gesture recognition has little in common with motion control using data gloves. The motion input of game consoles such as the Wii is likewise only partly comparable, because the game sensors, like those of data gloves, are moved directly by the user and measure direction and speed of that movement.
There is information in every human gesture that can be recognized. A head movement can signal approval or disapproval; a hand movement can be interpreted and evaluated in many ways. This information must be extracted and used for interaction.
Gesture recognition technology relies on camera- and infrared-based systems that evaluate the motion sequence, its position, direction and speed with refined algorithms. One method uses infrared sensors and determines gestures from the calculated position of an object: infrared light emitted by infrared LEDs is reflected from the moving hand and picked up by infrared detectors. Another method is phase-based and evaluates the temporal course of the signal changes to determine the direction of movement of an object. Combining both methods produces satisfactory results.
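The phase-based idea can be sketched as follows. This is an illustrative toy example, not any specific product's algorithm: two infrared detectors sit side by side, a hand passing over them reflects IR light, and the detector the hand crosses first peaks first, so the sign of the time lag between the two peaks gives the direction of movement.

```python
# Toy sketch: direction detection from the temporal order of
# reflection peaks on two infrared detectors (illustrative only).

def peak_time(samples):
    """Return the index of the strongest reflection in a sample list."""
    return max(range(len(samples)), key=lambda i: samples[i])

def swipe_direction(left, right):
    """Compare peak times of the left and right detectors.

    Returns 'left-to-right' if the left detector peaks first,
    'right-to-left' if the right one does, 'none' on a tie.
    """
    lag = peak_time(right) - peak_time(left)
    if lag > 0:
        return "left-to-right"
    if lag < 0:
        return "right-to-left"
    return "none"

# Simulated detector readings: the hand crosses the left detector first.
left_signal  = [0, 1, 5, 9, 4, 1, 0, 0, 0]
right_signal = [0, 0, 0, 1, 4, 9, 5, 1, 0]
print(swipe_direction(left_signal, right_signal))  # left-to-right
```

A real implementation would work on noisy, continuously sampled signals and typically use cross-correlation rather than a single peak, but the principle of evaluating the temporal course of the two signals is the same.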
An alternative method relies not on optical detection but on changes in electric field strength. A defined electric field is generated and measured continuously; changes in field strength caused by a person moving within the field are recorded and evaluated. The method can reliably detect movements and their direction at distances of 10 cm to 15 cm.
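The field-sensing approach can be sketched in the same spirit. In this hedged toy model (electrode names, baseline and threshold values are assumptions, not taken from any specific device), several receiver electrodes measure the field strength, a hand entering the field lowers the reading on the nearest electrode, and tracking which electrode is most disturbed over time yields the direction of movement:

```python
# Hedged sketch of electric near-field gesture sensing: a hand
# disturbs the field, lowering the reading on the nearest electrode.

BASELINE = 100.0   # undisturbed field-strength reading (arbitrary units)
THRESHOLD = 10.0   # minimum drop that counts as a detected hand

def most_disturbed(readings):
    """Return the electrode with the largest drop below baseline,
    or None if no drop exceeds the detection threshold."""
    name, value = min(readings.items(), key=lambda kv: kv[1])
    return name if BASELINE - value > THRESHOLD else None

def track_direction(frames):
    """Infer a movement from a time sequence of electrode readings."""
    path = [e for e in (most_disturbed(f) for f in frames) if e]
    if len(set(path)) >= 2:
        return f"{path[0]} -> {path[-1]}"
    return "no movement"

# Hand moves from the west electrode toward the east electrode.
frames = [
    {"west": 70, "east": 99, "north": 98, "south": 99},
    {"west": 85, "east": 88, "north": 97, "south": 98},
    {"west": 96, "east": 72, "north": 98, "south": 99},
]
print(track_direction(frames))  # west -> east
```

Real electric-field controllers interpolate a continuous hand position from all electrodes rather than picking a single one, but the underlying signal is the same: a measured deviation of the field from its undisturbed state.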
In practice, gesture recognition is implemented in the Natural User Interface (NUI), as realized in multi-touch screens.