Dim Wihl Gat Tun: The Case for Linguistic Expertise in NLP for Under-Documented Languages

Clarissa Forbes, Farhan Samir, Bruce Oliver, Changbing Yang, Edith Coates, Garrett Nicolai, Miikka Silfverberg


Abstract
Recent progress in NLP is driven by pretrained models leveraging massive datasets and has predominantly benefited the world’s political and economic superpowers. Technologically underserved languages are left behind because they lack such resources. Hundreds of underserved languages, nevertheless, have available data sources in the form of interlinear glossed text (IGT) from language documentation efforts. IGT remains underutilized in NLP work, perhaps because its annotations are only semi-structured and often language-specific. With this paper, we make the case that IGT data can be leveraged successfully provided that target language expertise is available. We specifically advocate for collaboration with documentary linguists. Our paper provides a roadmap for successful projects utilizing IGT data: (1) It is essential to define which NLP tasks can be accomplished with the given IGT data and how these will benefit the speech community. (2) Great care and target language expertise are required when converting the data into structured formats commonly employed in NLP. (3) Task-specific and user-specific evaluation can help to ascertain that the tools which are created benefit the target language speech community. We illustrate each step through a case study on developing a morphological reinflection system for the Tsimshianic language Gitksan.
Anthology ID:
2022.findings-acl.167
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2116–2130
URL:
https://aclanthology.org/2022.findings-acl.167
DOI:
10.18653/v1/2022.findings-acl.167
Cite (ACL):
Clarissa Forbes, Farhan Samir, Bruce Oliver, Changbing Yang, Edith Coates, Garrett Nicolai, and Miikka Silfverberg. 2022. Dim Wihl Gat Tun: The Case for Linguistic Expertise in NLP for Under-Documented Languages. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2116–2130, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Dim Wihl Gat Tun: The Case for Linguistic Expertise in NLP for Under-Documented Languages (Forbes et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.167.pdf
Video:
https://aclanthology.org/2022.findings-acl.167.mp4