Individuals with visual impairments face challenges in accessing written information. Braille addresses this by letting the reader sense raised dot patterns on paper by touch, so written information can be comprehended without sight. However, learning Braille, especially for those who lose their sight later in life, presents difficulties. This research introduces the BrailleSense system, a technological solution designed to assist visually impaired individuals in learning and utilizing the Braille system effectively. The system features a virtual prototype of camera-equipped hand gloves, aiming to alleviate the challenges of memorizing Braille patterns. Key contributions include the development of a custom lightweight Convolutional Neural Network (CNN) model for Braille pattern classification, termed BrailleNet. The model is deployed on a Raspberry Pi to investigate its feasibility on resource-limited portable devices, where BrailleNet achieves an accuracy of 97.44% under real-world constraints. The research also outlines the conceptual design through a 3D model of the gloves, addressing spatial allocation of components. While acknowledging open challenges in user comfort and camera alignment, BrailleSense presents a pioneering step towards empowering visually impaired individuals, enhancing literacy, and fostering independence. The dataset and code for the BrailleSense system are available on GitHub: https://github.com/faiyazabdullah/BrailleNet
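The abstract describes a lightweight CNN that classifies Braille patterns from camera images. As a minimal sketch of that idea, the forward pass below runs a single convolution, ReLU, max-pooling, and a dense softmax layer over a grayscale image with pure NumPy. The specific layer sizes, the 28x28 input, the 8 filters, and the 26 output classes (one per Braille letter) are illustrative assumptions, not the published BrailleNet architecture, and the weights are random rather than trained.

```python
import numpy as np

def conv2d(x, w, b):
    """Valid 2-D convolution of a single-channel image x with a bank of filters w."""
    n_f, kh, kw = w.shape
    H, W = x.shape
    out = np.zeros((n_f, H - kh + 1, W - kw + 1))
    for f in range(n_f):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[f, i, j] = np.sum(x[i:i + kh, j:j + kw] * w[f]) + b[f]
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling over each feature map."""
    n_f, H, W = x.shape
    x = x[:, :H // size * size, :W // size * size]
    x = x.reshape(n_f, H // size, size, W // size, size)
    return x.max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
image = rng.random((28, 28))                 # hypothetical 28x28 grayscale Braille-cell crop
w1 = rng.standard_normal((8, 3, 3)) * 0.1    # 8 small 3x3 filters (assumed, untrained)
b1 = np.zeros(8)

feat = max_pool(relu(conv2d(image, w1, b1)))  # shape (8, 13, 13)
flat = feat.ravel()
W_fc = rng.standard_normal((26, flat.size)) * 0.01  # 26 classes: one per Braille letter
probs = softmax(W_fc @ flat)                 # class probabilities, sums to 1
pred = int(np.argmax(probs))                 # predicted letter index 0..25
```

A real deployment on a Raspberry Pi would replace these loops with an optimized inference runtime and trained weights, but the data flow (image in, per-class probabilities out) is the same.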
Mr. M. Srinivasa Reddy, C. S. Chidan Kumar, G. Samatha, K. Sripal Reddy
Tasleem Kausar, Sajjad Manzoor, Adeeba Kausar, Yun Lu, Wasif Muhammad, Muhammad Adnan Ashraf
Sabitha Rani B S, Raji Gopinathan N, M Harishankar, Elizabeth Sherly
Md. Akif Hussain, Riazul Islam Rifat, Syed Bayes Iqbal, Simon Biswas, Md. Golam Rabiul Alam, Md Tanzim Reza
Surekha Janrao, Tavion Fernandes, Ojas Golatkar, Swaraj Dusane