Rangel Daroya, Daryl Peralta, Prospero C. Naval
Sign language is vital for people with hearing and speech impairments. In this work, we present a method to classify RGB images of static letter hand poses in sign language using a Convolutional Neural Network (CNN) inspired by Densely Connected Convolutional Networks (DenseNet). The method was further implemented to classify sign language in real time using a web camera. DenseNet is widely used for classification tasks because of the advantages it introduces, such as alleviating the vanishing gradient, a common problem in deep networks. Since we propose a deep network for our sign language classification task, this characteristic is useful. Our proposed network achieved an accuracy of 90.3%, which is comparable to other works, including those that use depth images in addition to RGB images. Our network also achieved prediction rates of 50 to 100 Hz, making it capable of real-time prediction.
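The dense connectivity pattern that the abstract credits with alleviating vanishing gradients can be sketched minimally: each layer receives the channel-wise concatenation of the input and all preceding layers' outputs. The sketch below is a toy NumPy illustration of that connectivity only; the layer count, growth rate, and random weights are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def dense_block(x, num_layers=3, growth_rate=4, rng=None):
    """Toy dense block: each layer sees the channel-wise concatenation
    of the input and every previous layer's output (DenseNet-style).
    Weights are random; this only illustrates the connectivity."""
    rng = np.random.default_rng(0) if rng is None else rng
    features = [x]  # list of (H, W, C_i) feature maps
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)       # all prior features
        w = rng.standard_normal((inp.shape[-1], growth_rate)) * 0.1
        out = np.maximum(inp @ w, 0.0)                # 1x1 conv + ReLU
        features.append(out)
    # output also concatenates everything, so gradients reach early
    # layers through short paths
    return np.concatenate(features, axis=-1)

x = np.ones((8, 8, 3))
y = dense_block(x)
print(y.shape)  # channels grow: 3 + 3 * 4 = (8, 8, 15)
```

Because every layer is connected to the final concatenation, gradient paths to early layers stay short, which is the property the abstract refers to.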