Semi-Supervised Spoken Language Glossification

Huijie Yao, Wengang Zhou, Hao Zhou, Houqiang Li


Abstract
Spoken language glossification (SLG) aims to translate spoken language text into sign language gloss, i.e., a written record of sign language. In this work, we present a framework named Semi-Supervised Spoken Language Glossification (S3LG) for SLG. To tackle the bottleneck of limited parallel data in SLG, our S3LG incorporates large-scale monolingual spoken language text into SLG training. The proposed framework follows the self-training structure that iteratively annotates and learns from pseudo labels. Considering the lexical similarity and syntactic difference between sign language and spoken language, our S3LG adopts both a rule-based heuristic and a model-based approach for auto-annotation. During training, we randomly mix these complementary synthetic datasets and mark their origins with a special token. As the synthetic data may be of lower quality, S3LG further leverages consistency regularization to reduce the negative impact of noise in the synthetic data. Extensive experiments are conducted on public benchmarks to demonstrate the effectiveness of S3LG. Our code is available at https://github.com/yaohj11/S3LG.
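The mixing step described above (two complementary auto-annotators whose outputs are combined and tagged with a source token) can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the tag tokens, the toy rule-based annotator, and the 50/50 mixing ratio are all assumptions for demonstration.

```python
import random

# Special tokens marking which annotator produced each pseudo label
# (token names are illustrative, not taken from the paper).
RULE_TAG, MODEL_TAG = "<rule>", "<model>"

def rule_based_annotate(text):
    # Toy heuristic stand-in: gloss often drops function words and
    # uses uppercase lemmas; real rules would be language-specific.
    stopwords = {"the", "a", "an", "is", "are", "to"}
    return " ".join(w.upper() for w in text.split() if w.lower() not in stopwords)

def model_based_annotate(text):
    # Placeholder for a trained SLG model's prediction on monolingual text;
    # here it simply reuses the heuristic so the sketch is runnable.
    return rule_based_annotate(text)

def build_mixed_dataset(monolingual_texts, mix_ratio=0.5, seed=0):
    """Randomly mix the two synthetic sources, tagging each input
    with a special token that marks the annotator it came from."""
    rng = random.Random(seed)
    mixed = []
    for text in monolingual_texts:
        if rng.random() < mix_ratio:
            mixed.append((f"{RULE_TAG} {text}", rule_based_annotate(text)))
        else:
            mixed.append((f"{MODEL_TAG} {text}", model_based_annotate(text)))
    rng.shuffle(mixed)
    return mixed
```

In the self-training loop, a dataset built this way would be combined with the gold parallel data at each iteration, and the model-based annotator would be refreshed from the latest checkpoint.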
Anthology ID:
2024.acl-long.504
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9300–9312
URL:
https://aclanthology.org/2024.acl-long.504
Cite (ACL):
Huijie Yao, Wengang Zhou, Hao Zhou, and Houqiang Li. 2024. Semi-Supervised Spoken Language Glossification. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9300–9312, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Semi-Supervised Spoken Language Glossification (Yao et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.504.pdf