Adiga, Sushmitha; Jyothi, S.; Umadevi, R.; Snehalatha, G. D.; Surbhi, C.
Sign language enables communication among hearing-impaired people, yet it is often their only means of communicating with non-signers, most of whom do not understand it. American Sign Language (ASL) is widely used and well suited to this purpose. This application uses technology to bridge the communication gap between non-signers and people with hearing impairments, fostering inclusivity and understanding: it detects signs and generates natural sentences from the resulting sequence of words. By using MobileNetV2 for sign language gesture detection and Natural Language Processing (NLP) techniques for sentence generation, the proposed project not only recognizes the visual expressions of American Sign Language but also converts them into coherent English sentences. This integration of computer vision and language processing holds the potential to expand communication for hearing-impaired and deaf-mute people, providing them with a more effective means of interacting with the broader community.
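The two-stage pipeline described above can be illustrated with a minimal sketch. The abstract does not specify the implementation details, so everything below is an assumption: the MobileNetV2 recognizer is represented by a placeholder that maps frame-level predictions to word tokens, and the NLP sentence-generation module is stood in for by a simple token-assembly function.

```python
def recognize_signs(frame_predictions, label_map):
    """Placeholder for the MobileNetV2 stage (hypothetical): map
    per-frame class indices to word tokens, collapsing repeats
    that arise when one sign spans several frames."""
    tokens = []
    for idx in frame_predictions:
        word = label_map[idx]
        if not tokens or tokens[-1] != word:
            tokens.append(word)
    return tokens


def generate_sentence(tokens):
    """Simplified stand-in for the NLP sentence-generation stage:
    join recognized word tokens into a capitalized, punctuated
    English sentence."""
    if not tokens:
        return ""
    sentence = " ".join(tokens)
    return sentence[0].upper() + sentence[1:] + "."


# Hypothetical label map and per-frame classifier output.
labels = {0: "i", 1: "want", 2: "water"}
frames = [0, 0, 1, 1, 1, 2, 2]
print(generate_sentence(recognize_signs(frames, labels)))  # → I want water.
```

In the actual system, `recognize_signs` would be backed by a trained MobileNetV2 classifier and `generate_sentence` by a full NLP model; this sketch only shows how the two stages compose.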