Sohail Sohail, Muhammad Yaseen Khan, Nafeesunisa Siddiqui, Kiran Puttegowda
Abstract Communication is a foremost and basic human necessity. The ability to speak without impediment is a boon; sadly, however, we often see people around us who suffer from speech or hearing impairments. 'Sign Language' (SL) has therefore emerged as an alternative standard language, globally understood and adopted by the deaf community. Although SL is a communication tool, in practice most hearing people do not understand it properly, which again hinders communication between speech/hearing-impaired and hearing persons. Many research attempts have been made to resolve this problem via wearable technology and other paradigms of computational solutions; however, almost all of them focus on English or Western languages, offering little help to the people of the Indian subcontinent. In this paper, we therefore propose a solution that employs the Kinect motion sensor to detect signs (in SL) through hand movements and gestures and translate them into Urdu/Hindi, in a form readable and audible to hearing persons. The proposed system is developed on a feature-based model; hence, we store signs in a dictionary/training set for the development of the baseline system, while users can customize the system by adding their own sign patterns in the future.
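The abstract describes a feature-based model in which signs are stored in a dictionary/training set, an observed gesture is matched against the stored patterns, and users can add their own signs. A minimal sketch of that matching idea is shown below; all names, feature vectors, the Euclidean distance metric, and the threshold are illustrative assumptions, not the authors' actual implementation or Kinect feature pipeline.

```python
import math

# Hypothetical sign dictionary: each sign maps to a feature vector
# (e.g. normalised hand/joint coordinates extracted from Kinect data).
# The vectors below are made-up values for illustration only.
SIGN_DICTIONARY = {
    "salaam":   [0.10, 0.85, 0.40, 0.90],
    "shukriya": [0.70, 0.20, 0.65, 0.15],
}

# Urdu output text for each recognised sign (illustrative).
TRANSLATIONS = {"salaam": "سلام", "shukriya": "شکریہ"}

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognise(features, threshold=0.5):
    """Return the closest stored sign, or None if nothing is near enough."""
    best_sign, best_dist = None, float("inf")
    for sign, template in SIGN_DICTIONARY.items():
        d = euclidean(features, template)
        if d < best_dist:
            best_sign, best_dist = sign, d
    return best_sign if best_dist <= threshold else None

def add_sign(name, features, translation):
    """User customisation: register a new sign pattern and its translation."""
    SIGN_DICTIONARY[name] = features
    TRANSLATIONS[name] = translation
```

With this sketch, a gesture whose features lie close to a stored template is resolved to that sign and its Urdu text (which could then be displayed or passed to a text-to-speech engine), while an unfamiliar gesture yields no match; `add_sign` mirrors the paper's note that users may extend the dictionary with their own patterns.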