Communication barriers between hearing-impaired individuals, speech-impaired individuals, and non-signers create persistent challenges in education, healthcare, and daily interactions. Although Indian Sign Language (ISL) is widely used within the deaf community, limited public awareness of the language significantly restricts effective communication. This paper presents a real-time ISL recognition system developed using deep learning and computer vision. The system captures hand gestures through a webcam, extracts 21 hand landmarks using Google MediaPipe, and classifies them with a trained Convolutional Neural Network (CNN). Recognized gestures are translated into text and speech to improve accessibility. In addition to gesture recognition, the system provides speech-to-text functionality, enabling two-way communication between hearing and non-hearing users. The proposed solution is lightweight, cost-effective, and easy to deploy, making it suitable for educational institutions, assistive technologies, and inclusive digital platforms. Experimental results demonstrate high accuracy and robust performance, validating the effectiveness of the proposed framework.
Indian Sign Language, Deep Learning, Gesture Recognition, CNN, MediaPipe.
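The following is a minimal, illustrative sketch of the capture, landmark-extraction, and classification loop summarized above, assuming an OpenCV, MediaPipe, and TensorFlow/Keras stack. The model file name "isl_cnn.h5", the LABELS list, and the flattened 21-landmark input shape are hypothetical placeholders rather than details taken from the paper, and the text-to-speech and speech-to-text components are omitted for brevity.

```python
# Hypothetical sketch of the webcam -> MediaPipe -> CNN pipeline; not the
# authors' implementation. Model file and label set are placeholders.
import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

LABELS = ["A", "B", "C"]                           # placeholder gesture classes
model = tf.keras.models.load_model("isl_cnn.h5")   # hypothetical trained CNN

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)                          # webcam capture

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            # Flatten the 21 (x, y, z) landmarks into one feature vector
            # (assumed input format for the placeholder model).
            features = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
            probs = model.predict(features[np.newaxis, :], verbose=0)[0]
            label = LABELS[int(np.argmax(probs))]
            cv2.putText(frame, label, (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        cv2.imshow("ISL Recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
```

Classifying compact landmark features rather than raw frames keeps the model small, which is consistent with the lightweight, webcam-only deployment described in the abstract.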
IRE Journals:
A. R. Pragna, Akki Likitha, Hema Sree, Kavya K., Sheela B. P., and Dr. B. Sreepathi, "Indian Sign Language Recognition System," Iconic Research and Engineering Journals, Volume 9, Issue 6, 2025, pp. 1733-1736. https://doi.org/10.64388/IREV9I6-1712873
IEEE:
A. R. Pragna, Akki Likitha, Hema Sree, Kavya K., Sheela B. P., and B. Sreepathi, "Indian Sign Language Recognition System," Iconic Research and Engineering Journals, vol. 9, no. 6, pp. 1733-1736, 2025. https://doi.org/10.64388/IREV9I6-1712873