M Naveenkumar, S Srithar, G Ramesh Kalyan, E Vetrimani, S. Alagumuthukrishnan
Sign language is among the most difficult languages for an end-user to understand without knowing the meaning of each sign, since it depends on specialized gesture motions. The gestures are formed primarily by the hands, aided by facial expression and body posture. In this article, gesture recognition for static sign language is proposed using deep learning with image processing. The contribution comprises two parts. First, static American Sign Language binary images are resized with bicubic interpolation, and good recognition results are obtained by detecting the hand boundary with various edge detection techniques. Second, the alphabetic characters of sign language are classified using a Convolutional Neural Network (CNN). With the proposed method, a test accuracy of 96% and an F1-score of 99% were achieved for 10 different hand gesture classes.
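The abstract does not specify which edge detector the authors used, so the following is only an illustrative sketch of the boundary-detection preprocessing step, using standard 3x3 Sobel kernels on a synthetic binary gesture mask (the image, kernel choice, and threshold are assumptions, not the paper's method):

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Approximate the gradient magnitude with 3x3 Sobel kernels,
    then threshold it to get a binary edge map -- a stand-in for
    the hand-boundary detection stage described in the abstract."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Convolve the interior pixels (borders left as zero for brevity).
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    mag = np.hypot(gx, gy)
    return (mag >= thresh).astype(np.uint8)

# Synthetic binary "hand" mask: a filled square on a dark background.
img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0
edges = sobel_edges(img)
```

Only the outline of the filled region survives in `edges`: the gradient vanishes inside the uniform square, so the edge map marks its border, which is the boundary information a downstream CNN classifier would then consume.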