Towards Privacy-Aware Sign Language Translation at Scale

Phillip Rust, Bowen Shi, Skyler Wang, Necati Cihan Camgoz, Jean Maillard


Abstract
A major impediment to the advancement of sign language translation (SLT) is data scarcity. Much of the sign language data currently available on the web cannot be used for training supervised models due to the lack of aligned captions. Furthermore, scaling SLT using large-scale web-scraped datasets bears privacy risks due to the presence of biometric information, which the responsible development of SLT technologies should account for. In this work, we propose a two-stage framework for privacy-aware SLT at scale that addresses both of these issues. We introduce SSVP-SLT, which leverages self-supervised video pretraining on anonymized and unannotated videos, followed by supervised SLT finetuning on a curated parallel dataset. SSVP-SLT achieves state-of-the-art finetuned and zero-shot gloss-free SLT performance on the How2Sign dataset, outperforming the strongest respective baselines by over 3 BLEU-4. Based on controlled experiments, we further discuss the advantages and limitations of self-supervised pretraining and anonymization via facial obfuscation for SLT.
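The abstract describes a two-stage pipeline: self-supervised pretraining of a video encoder on anonymized, unannotated sign videos, then supervised translation finetuning on a curated parallel dataset. The sketch below illustrates that overall two-stage structure only; the module sizes, the masked-reconstruction objective, and all class and variable names are illustrative assumptions, not the paper's actual SSVP-SLT architecture or hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 256  # illustrative sizes, not taken from the paper


class VideoEncoder(nn.Module):
    """Toy sign-video encoder over precomputed frame features (stand-in for the paper's backbone)."""

    def __init__(self, dim=DIM, depth=2, heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.layers = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):  # x: (B, T, DIM) features from anonymized frames
        return self.layers(x)


def pretrain_step(encoder, recon_head, frames, mask_ratio=0.75):
    """Stage 1 (illustrative): self-supervised masked reconstruction on unannotated video."""
    mask = torch.rand(frames.shape[:2], device=frames.device) < mask_ratio
    corrupted = frames.masked_fill(mask.unsqueeze(-1), 0.0)
    recon = recon_head(encoder(corrupted))
    return F.mse_loss(recon[mask], frames[mask])


def finetune_step(encoder, embed, decoder, lm_head, frames, tgt_ids):
    """Stage 2 (illustrative): supervised SLT finetuning on parallel (video, caption) pairs."""
    memory = encoder(frames)                       # contextual sign representations
    tgt = embed(tgt_ids[:, :-1])                   # teacher forcing: shift targets right
    causal = nn.Transformer.generate_square_subsequent_mask(tgt.size(1)).to(frames.device)
    out = decoder(tgt, memory, tgt_mask=causal)
    logits = lm_head(out)
    return F.cross_entropy(logits.reshape(-1, VOCAB), tgt_ids[:, 1:].reshape(-1))


# Wiring the two stages together (hypothetical data and shapes throughout)
encoder = VideoEncoder()
recon_head = nn.Linear(DIM, DIM)
pretrain_loss = pretrain_step(encoder, recon_head, torch.randn(2, 16, DIM))

embed = nn.Embedding(VOCAB, DIM)
dec_layer = nn.TransformerDecoderLayer(DIM, 4, batch_first=True)
decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
lm_head = nn.Linear(DIM, VOCAB)
finetune_loss = finetune_step(encoder, embed, decoder, lm_head,
                              torch.randn(2, 16, DIM), torch.randint(0, VOCAB, (2, 8)))
```

The key point the sketch conveys is that the same encoder is trained in both stages: stage 1 needs no captions (and can run on anonymized footage), while stage 2 attaches a text decoder and uses the scarce parallel data only for the final translation objective.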
Anthology ID:
2024.acl-long.467
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8624–8641
URL:
https://aclanthology.org/2024.acl-long.467
Cite (ACL):
Phillip Rust, Bowen Shi, Skyler Wang, Necati Cihan Camgoz, and Jean Maillard. 2024. Towards Privacy-Aware Sign Language Translation at Scale. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8624–8641, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Towards Privacy-Aware Sign Language Translation at Scale (Rust et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.467.pdf