
e-SALIN: development of a Tagalog FSL-to-text and speech-to-text-and-FSL-images in real-time web-based Filipino Sign Language and neural machine translation systems for selected 3 major Philippine languages: Cebuano, Ilocano, and Waray / Angelo SJ. Avanceña, Rose Belle G. Bolor, Sophia B. Espiritu, Annabela T. Ignacio, Eruel Andrei O. Marasigan, and Aaron B. Mendoza.--

Material type: Text
Publication details: Manila: Technological University of the Philippines, 2025.
Description: xvii, 272 pages; 29 cm
LOC classification:
  • BTH TK 870 A93 2025
Dissertation note: College of Engineering.-- Bachelor of Science in Electronics Engineering: Technological University of the Philippines, 2025.
Holdings
  Item type: Bachelor's Thesis COE
  Current library: TUP Manila Library
  Shelving location: Thesis Section-2nd floor
  Call number: BTH TK 870 A93 2025
  Copy number: c.1
  Status: Not for loan
  Barcode: BTH0006470

Bachelor's thesis

College of Engineering.--
Bachelor of Science in Electronics Engineering: Technological University of the Philippines,
2025.

Includes bibliographic references and index.

e-SALIN is a real-time, web-based translation system designed to bridge the communication gap between the deaf and hearing communities in the Philippines. While many deaf individuals primarily use Filipino Sign Language (FSL), limited understanding among the hearing often leads to miscommunication and exclusion. e-SALIN addresses this by translating FSL into Tagalog text and into three widely spoken Philippine languages: Cebuano, Ilocano, and Waray. It promotes inclusive and respectful communication through gender-fair and culturally sensitive language. Powered by machine learning and computer vision, including CNN and LSTM models, the system recognizes and translates signs in real time, achieving over 94% accuracy within three seconds.
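As a rough illustration only (this is not the authors' code; the frame size, clip length, and number of sign classes below are hypothetical), a CNN+LSTM sign classifier of the kind described above can be sketched in Keras roughly as follows: a CNN extracts features from each video frame, and an LSTM models the frame sequence before a softmax layer predicts the sign.

```python
# Illustrative sketch only -- not the thesis implementation.
# Assumes 30-frame sign clips at 64x64 RGB and a hypothetical
# vocabulary of 50 FSL signs.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_FRAMES, HEIGHT, WIDTH, CHANNELS = 30, 64, 64, 3
NUM_SIGNS = 50  # hypothetical number of sign classes

# Per-frame CNN feature extractor.
cnn = models.Sequential([
    layers.Input((HEIGHT, WIDTH, CHANNELS)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
])

# Apply the CNN to every frame, then model the temporal
# dynamics of the sign with an LSTM.
model = models.Sequential([
    layers.Input((NUM_FRAMES, HEIGHT, WIDTH, CHANNELS)),
    layers.TimeDistributed(cnn),
    layers.LSTM(64),
    layers.Dense(NUM_SIGNS, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```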
For translation, e-SALIN also uses neural machine translation (NMT) models, MarianMT and mBART, trained on Google Colab Pro+ with a dataset of 507 Tagalog words translated into Cebuano, Ilocano, and Waray. Despite working with low-resource languages, the system maintains high translation quality, with BLEU scores ranging from 86.9% to 98.8%. These results highlight e-SALIN's strong sign recognition and multilingual translation capabilities and its potential to support inclusive, multilingual communication.
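Again purely as an illustration (the record does not include the authors' code, and the checkpoint path below is a hypothetical placeholder for a fine-tuned Tagalog-to-Cebuano model), MarianMT-style models are typically used through the Hugging Face transformers API, with sacrebleu computing BLEU against reference translations:

```python
# Illustrative sketch only -- not the thesis code.
# "./marianmt-tl-ceb" is a hypothetical fine-tuned checkpoint.
from transformers import MarianMTModel, MarianTokenizer
import sacrebleu

model_name = "./marianmt-tl-ceb"  # placeholder checkpoint path
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def translate(sentences):
    """Translate a list of Tagalog sentences with the MarianMT model."""
    batch = tokenizer(sentences, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return [tokenizer.decode(t, skip_special_tokens=True) for t in generated]

# Toy evaluation: compare model output to a Cebuano reference with BLEU.
sources = ["Magandang umaga"]        # Tagalog input
references = [["Maayong buntag"]]    # one reference stream, aligned to sources
hypotheses = translate(sources)
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.1f}")
```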

© 2025 Technological University of the Philippines.
All Rights Reserved.
