Michael Tornow, Ayoub Al-Hamadi, Vinzenz Borrmann
Mobile robots can assist humans in disaster management and environmental perception by forming a multi-robot team that works as a distributed sensor-actor system. To coordinate the operations of such a team, the human-machine interface must decode commands that may be carried out by a different robot, e.g. pointing to an area to be scanned. In such cases the human operator issues commands through hand actions as the interaction modality; these are registered by a camera, which provides the basis for feature extraction. In this paper we present a gesture- and hand-posture-based HMI system. For segmentation of hand regions we combine color and depth information. As feature vectors, varying combinations of Fourier descriptors, cosine descriptors, Hu moments, and geometric features are extracted from the image and depth data. For classification of hand postures the feature vector is processed by an artificial neural network. A maximum overall classification rate of 93% is achieved for single-image processing. Stabilizing the hand-shape classification for online sequences with a time histogram enables robust robot control. The HMI thereby serves as the communication basis for multi-robot environment perception and disaster management.
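The time-histogram stabilization mentioned in the abstract can be illustrated with a minimal sketch: per-frame posture labels from the classifier are collected in a sliding window, and a label is only emitted once it dominates the recent history. The class name, window length, and dominance threshold below are hypothetical illustration choices, not the paper's actual parameters or scheme.

```python
from collections import Counter, deque


class PostureSmoother:
    """Stabilize noisy per-frame posture labels with a sliding time histogram.

    A label is emitted only when it holds at least `min_share` of the
    votes in the recent window; otherwise no stable decision is made.
    (Hypothetical sketch; parameters are illustrative assumptions.)
    """

    def __init__(self, window=15, min_share=0.6):
        self.window = deque(maxlen=window)   # recent per-frame labels
        self.min_share = min_share           # required vote fraction

    def update(self, label):
        self.window.append(label)
        best, count = Counter(self.window).most_common(1)[0]
        if count / len(self.window) >= self.min_share:
            return best      # stable posture decision
        return None          # history too ambiguous; withhold command
```

Fed with an online sequence of classifier outputs, isolated misclassifications (e.g. one "grab" frame inside a run of "point" frames) are suppressed, so the robot only receives a command once the posture estimate is temporally consistent.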