This paper presents a novel real-time system for human-computer interaction with hand gestures at a distance. The system follows the principle of multi-touch techniques but is touch-free and needs only an ordinary computer screen, whereas common multi-touch implementations demand special hardware platforms. We use Microsoft's Kinect sensor as the input device and exploit its depth information to segment the human body from a cluttered background; skin color is also utilized in the segmentation process. In addition, our system performs hand identification and tracking, as well as gesture recognition. The k-nearest neighbors (KNN) algorithm is chosen to solve the object-tracking problem. The tracked blob information is then transmitted to applications via the Tangible User Interface Object (TUIO) protocol. The result is a natural user interface characterized by non-contact, barehanded control.
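To make the tracking step concrete, the following is a minimal sketch, not the paper's implementation, of how detected hand blobs in the current frame might be associated with tracked hands from the previous frame using nearest-neighbor (k = 1) matching on centroid distance. All names (`match_blobs`, `max_dist`) and the greedy matching strategy are assumptions for illustration only.

```python
import math

def match_blobs(tracked, detected, max_dist=80.0):
    """Greedily assign each tracked hand centroid (previous frame)
    to its nearest detected blob centroid (current frame).
    Detections farther than max_dist pixels from every track
    are treated as newly appearing hands."""
    assignments = {}   # track index -> detection index
    used = set()
    for t_idx, (tx, ty) in enumerate(tracked):
        best, best_d = None, max_dist
        for d_idx, (dx, dy) in enumerate(detected):
            if d_idx in used:
                continue
            d = math.hypot(dx - tx, dy - ty)  # Euclidean distance
            if d < best_d:
                best, best_d = d_idx, d
        if best is not None:
            assignments[t_idx] = best
            used.add(best)
    # Unmatched detections start new tracks
    new_tracks = [i for i in range(len(detected)) if i not in used]
    return assignments, new_tracks
```

In a full pipeline, the matched blob positions would then be packed into TUIO cursor messages and sent to the client application each frame.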