Sign language recognition. Time-exposure image of a man using a computerised method for recognising sign language, the system of hand gestures used by deaf people to communicate. A computer interpretation of the motion of the man's hands appears on a screen. The computer (not seen here) analyses images from a camera (at top right), mapping the hand movements and comparing them with stored computer models of the signs in the language. Such a system could be used to create a human-computer interface analogous to oral speech recognition systems. Photographed during "smart room" research by the Media Lab at the Massachusetts Institute of Technology, USA.
Model release available. Property release not required.
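The matching step described in the caption — comparing observed hand movements against stored models of signs — can be sketched as a simple nearest-template classifier. This is only an illustrative toy, not the MIT Media Lab system: the sign names, template trajectories, and the plain Euclidean distance measure are all assumptions made for the example.

```python
import math

# Hypothetical stored "models" of signs: each is a short sequence of (x, y)
# hand positions. Names and trajectories are illustrative only.
SIGN_TEMPLATES = {
    "hello": [(0.0, 0.0), (0.5, 0.5), (1.0, 0.0)],
    "thanks": [(0.0, 1.0), (0.5, 0.5), (1.0, 1.0)],
}

def trajectory_distance(a, b):
    """Sum of Euclidean distances between corresponding sample points."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def recognise_sign(trajectory):
    """Return the stored sign whose template is nearest the observed path."""
    return min(
        SIGN_TEMPLATES,
        key=lambda name: trajectory_distance(trajectory, SIGN_TEMPLATES[name]),
    )

# A noisy observation close to the "hello" template.
observed = [(0.1, 0.0), (0.5, 0.4), (0.9, 0.1)]
print(recognise_sign(observed))  # prints "hello"
```

A real system would track hands over many video frames and typically use a sequence model (for example, hidden Markov models, which were common in gesture recognition research of that era) rather than raw point-to-point distances, but the template-comparison idea is the same.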