Language is the fundamental tool for interacting with other human beings, and verbal exchange has been the primary mode of human communication since the beginning of our species. Only 2.78 percent of Americans are unable to communicate effectively in the language most often used in daily life. While technological progress has undeniably improved quality of life for the vast majority of people, those with less will always face greater barriers to interaction as a consequence. Sign language, based on hand movements, is used by those who have trouble speaking; it enables people who are deaf or speech-impaired to hold meaningful conversations with their hearing peers. The results of this study suggest that a customised prototype of a Feed-Forward Neural Network (FFNN), equipped with the capability to automatically recognise sign language, could help people with normal hearing and speech communicate with people who have hearing or speech impairments. The system recognised hand signals by extracting feature points and classifying them with the feed-forward neural network. Technology that combines hand-gesture recognition with speech processing and employs a Hidden Markov Model (HMM) allows people with speech impairments and hearing people to interact with one another.
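The recognition pipeline described above (feature points extracted from a hand signal, then classified by a feed-forward network) can be sketched roughly as follows. This is an illustrative toy, not the authors' implementation: the layer sizes, the 21-landmark input representation, and the softmax output are all assumptions, and the weights here are random rather than trained on labelled signs.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Hidden-layer nonlinearity
    return np.maximum(0.0, x)

def softmax(x):
    # Convert raw scores into a probability distribution over sign classes
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class FeedForwardSignClassifier:
    """Toy FFNN mapping hand feature points to sign-class probabilities."""

    def __init__(self, n_features=42, n_hidden=64, n_classes=26):
        # Assumed input: 21 hand landmarks x (x, y) coordinates = 42 features
        self.W1 = rng.normal(0.0, 0.1, (n_features, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.b2 = np.zeros(n_classes)

    def forward(self, features):
        # One hidden layer, then a softmax over the candidate signs
        h = relu(features @ self.W1 + self.b1)
        return softmax(h @ self.W2 + self.b2)

clf = FeedForwardSignClassifier()
landmarks = rng.uniform(0.0, 1.0, (1, 42))  # one extracted hand signal
probs = clf.forward(landmarks)              # probabilities over 26 signs
predicted_class = int(probs.argmax())
```

In a real system the weights would be learned from labelled gesture data, and the predicted class would be mapped to a letter or word for display or speech synthesis.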
P. Ilanchezhian, Ishanvi Singh, M. Balaji, A. Manoj Kumar, S. Muhamad Yaseen
M. K. Naik, K. Khatoon, Akkala Krishna, Yerasi Reddy, Yerraguntla Prakash, Nallabothula Pavan
Neetu Singla, Vinayak Choubey, Swarnima Rai