Gesture recognition is gaining attention as an attractive feature for ubiquitous, context-aware IoT applications. Radars are appealing as a primary or secondary sensing system because they operate in darkness, under high-intensity lighting, and at longer distances than many competing systems. Starting from this observation, we present a generic, low-cost, mm-wave radar-based gesture recognition system. Potential benefits of mm-wave radars include high spatial resolution due to the small wavelength, the availability of multiple antennas in a small area, and low interference thanks to the natural attenuation of mm-wave radiation. We experimentally evaluate our COTS solution on eight different gestures using two low-complexity classification algorithms: the unsupervised Self-Organizing Map (SOM) and the supervised Learning Vector Quantization (LVQ). To test robustness, we consider gestures performed by a human hand and by the whole body, at both short and long distances. In our preliminary evaluations, LVQ and SOM correctly detect 75% and 60% of all gestures, respectively, from the raw, unprocessed data. The detection rate is significantly higher (>90%) for selected gesture groups. We argue that performance suffers from inaccurate angle-of-arrival (AoA) estimation; accordingly, we evaluate a two-radar setup that improves the estimation accuracy by 8-9%.
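The abstract names LVQ as one of the two low-complexity classifiers. As a rough, hypothetical illustration only (the paper's actual feature extraction from radar data and its parameter choices are not shown here), a minimal LVQ1 sketch in Python/NumPy could look like this: one prototype per gesture class, pulled toward correctly classified samples and pushed away from misclassified ones.

```python
import numpy as np

def train_lvq(X, y, n_classes, lr=0.1, epochs=50, seed=0):
    """LVQ1 sketch: one prototype per class (hypothetical setup).

    X: (n_samples, n_features) feature vectors, y: integer class labels.
    """
    rng = np.random.default_rng(seed)
    # Initialize each prototype at its class mean (one common choice)
    protos = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
    labels = np.arange(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            d = np.linalg.norm(protos - X[i], axis=1)
            w = np.argmin(d)  # best-matching prototype
            # Move toward the sample if the class matches, away otherwise
            sign = 1.0 if labels[w] == y[i] else -1.0
            protos[w] += sign * lr * (X[i] - protos[w])
    return protos, labels

def predict_lvq(protos, labels, X):
    """Assign each sample the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]
```

In the paper's setting, `X` would hold features derived from the radar returns (e.g. range/Doppler/AoA measurements), with one class per gesture; the exact representation is a detail of the full paper, not reproduced here.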