Development of sign language learning system /
Caryl Ann O. Eusoya, Justin Lloyd P. Magat, Raffy A. Morada, Jecele A. Sombese, and Danica C. Valera. --
Manila : Technological University of the Philippines, 2024. --
x, 156 pages ; 29 cm + 1 CD-ROM (4 3/4 in.)
Thesis (undergraduate) -- College of Industrial Technology.
Includes bibliography.
This thesis presents the development of a sign language learning system intended to improve American Sign Language (ASL) education for both hearing and non-hearing individuals in the Philippines. The research was motivated by barriers to deaf education and by difficulties in accessing ASL instruction. The system was built with Python, TypeScript, Tailwind, Next.js, Flask, LSTM neural networks, and MediaPipe for motion tracking. It uses a camera for gesture recognition, a monitor for interactive video lessons, and assessments to track learner progress. Testing was conducted with hearing and non-hearing students, as well as experts, against the ISO 25010 quality standard. In the second iteration, 88.89% of test cases passed, indicating high system reliability. Overall effectiveness evaluations showed excellent ease of use, quality of output, accuracy, and performance. Twenty-four (24) respondents evaluated the project using a Likert scale and the descriptive interpretation of the mean for functionality (x̄ = 4.63), usability (x̄ = 4.73), efficiency (x̄ = 4.69), credibility (x̄ = 4.48), and cost-effectiveness (x̄ = 4.94), yielding an overall mean of 4.69, which corresponds to a descriptive rating of Strongly Agree. The learning system provides an immersive ASL education experience tailored to beginner and intermediate skill levels. It fosters inclusive learning spaces while promoting awareness. This research demonstrates the potential of motion-based technologies to transform sign language pedagogy through adaptive, engaging multimedia tools. The system can be extended with a larger vocabulary, optimized tracking, and personalized assessments to broaden accessible ASL literacy.
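As a rough illustration of the recognition pipeline described in the abstract (MediaPipe motion tracking feeding an LSTM classifier), the following Python sketch shows one common way such pieces can be wired together. It is an assumed outline, not the authors' implementation; the function names, sequence length, and layer sizes are illustrative only.

    import numpy as np
    import mediapipe as mp
    import tensorflow as tf

    mp_hands = mp.solutions.hands

    def extract_keypoints(frame_rgb, hands):
        # Run MediaPipe hand tracking on one RGB frame and flatten the
        # 21 (x, y, z) hand landmarks into a fixed-length feature vector.
        results = hands.process(frame_rgb)
        if results.multi_hand_landmarks:
            landmarks = results.multi_hand_landmarks[0].landmark
            return np.array([[p.x, p.y, p.z] for p in landmarks]).flatten()  # shape (63,)
        return np.zeros(21 * 3)  # no hand detected in this frame

    def build_sign_classifier(num_signs, seq_len=30, num_features=63):
        # LSTM classifier over a window of per-frame landmark vectors
        # (seq_len and layer sizes are assumptions for illustration).
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(seq_len, num_features)),
            tf.keras.layers.LSTM(64, return_sequences=True),
            tf.keras.layers.LSTM(64),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(num_signs, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])
        return model

In a pipeline of this kind, a fixed-length window of per-frame landmark vectors is classified as a whole, which is why the LSTM input shape is (sequence length, features per frame); per-frame keypoints would typically be collected from the camera feed and buffered before being passed to the trained model.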