Researchers have developed a system that could make gestural interfaces much more practical. The hardware for the new gesture-based computing system consists of nothing more than an ordinary webcam and a pair of brightly coloured Lycra gloves.
The system was developed by Robert Wang, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL), together with Jovan Popović, an associate professor of electrical engineering and computer science.
Wang and Popović's system can translate gestures made with a gloved hand into the corresponding gestures of a 3D model of the hand on screen, with almost no lag time.
In a demonstration video, Wang shows the speed and precision with which the system can gauge hand position in three dimensions — including the flexing of individual fingers — as well as a possible application in mechanical engineering.