Sign Language Translation with Sentence Embedding Supervision

Yasser Hamidullah, Josef van Genabith, Cristina España-Bonet


Abstract
State-of-the-art sign language translation (SLT) systems facilitate the learning process through gloss annotations, either in an end-to-end manner or as an intermediate step. Unfortunately, gloss-labelled sign language data is usually not available at scale and, when available, gloss annotations differ widely from dataset to dataset. We present a novel approach that uses sentence embeddings of the target sentences at training time to take the role of glosses. This new kind of supervision needs no manual annotation but is instead learned from raw textual data. As our approach easily facilitates multilinguality, we evaluate it on datasets covering German (PHOENIX-2014T) and American (How2Sign) sign languages and experiment with mono- and multilingual sentence embeddings and translation systems. Our approach significantly outperforms other gloss-free approaches, setting the new state of the art for datasets where glosses are not available and where no additional SLT datasets are used for pretraining, narrowing the gap between gloss-free and gloss-dependent systems.
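The core idea of the abstract — replacing gloss labels with frozen sentence embeddings of the target translations as the training signal — can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: random vectors stand in for the pooled sign-video features and for the pretrained sentence embeddings, and a single linear layer trained by gradient descent on a mean-squared-error loss stands in for the sign encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 8 sign clips, each a 16-dim pooled visual feature,
# paired with frozen 4-dim sentence embeddings of their target translations.
# In the paper these would come from a video encoder and a pretrained
# (possibly multilingual) sentence-embedding model; random vectors keep
# this sketch self-contained.
X = rng.normal(size=(8, 16))   # pooled sign-video features
E = rng.normal(size=(8, 4))    # target-sentence embeddings (frozen targets)

W = np.zeros((16, 4))          # toy linear "sign encoder" head

def mse(pred: np.ndarray, target: np.ndarray) -> float:
    """Mean-squared-error between predicted and target embeddings."""
    return float(np.mean((pred - target) ** 2))

loss_start = mse(X @ W, E)
for _ in range(500):           # plain gradient descent on the MSE objective
    grad = X.T @ (X @ W - E) / len(X)
    W -= 0.05 * grad
loss_end = mse(X @ W, E)

print(f"embedding-supervision loss: {loss_start:.3f} -> {loss_end:.3f}")
```

The point of the sketch is only that the supervision target is a continuous embedding of the target sentence, obtained without any manual gloss annotation, rather than a discrete gloss sequence.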
Anthology ID:
2024.acl-short.40
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
425–434
URL:
https://aclanthology.org/2024.acl-short.40
Cite (ACL):
Yasser Hamidullah, Josef van Genabith, and Cristina España-Bonet. 2024. Sign Language Translation with Sentence Embedding Supervision. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 425–434, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Sign Language Translation with Sentence Embedding Supervision (Hamidullah et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-short.40.pdf