Sequence-to-Sequence Lexical Normalization with Multilingual Transformers

Ana-Maria Bucur, Adrian Cosma, Liviu P. Dinu


Abstract
Current benchmark tasks for natural language processing contain text that is qualitatively different from the text used in informal day-to-day digital communication. This discrepancy has led to severe performance degradation of state-of-the-art NLP models when fine-tuned on real-world data. One way to resolve this issue is through lexical normalization, the process of transforming non-standard text, usually from social media, into a more standardized form. In this work, we propose a sentence-level sequence-to-sequence model based on mBART, which frames the problem as a machine translation problem. As noisy text is a pervasive problem across languages, not just English, we leverage the multilingual pre-training of mBART to fine-tune it to our data. While current approaches mainly operate at the word or subword level, we argue that our approach is straightforward from a technical standpoint and builds upon existing pre-trained transformer networks. Our results show that while word-level, intrinsic performance evaluation is behind other methods, our model improves performance on extrinsic, downstream tasks through normalization compared to models operating on raw, unprocessed social media text.
Anthology ID:
2021.wnut-1.53
Volume:
Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021)
Month:
November
Year:
2021
Address:
Online
Editors:
Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue:
WNUT
Publisher:
Association for Computational Linguistics
Pages:
473–482
URL:
https://aclanthology.org/2021.wnut-1.53
DOI:
10.18653/v1/2021.wnut-1.53
Cite (ACL):
Ana-Maria Bucur, Adrian Cosma, and Liviu P. Dinu. 2021. Sequence-to-Sequence Lexical Normalization with Multilingual Transformers. In Proceedings of the Seventh Workshop on Noisy User-generated Text (W-NUT 2021), pages 473–482, Online. Association for Computational Linguistics.
Cite (Informal):
Sequence-to-Sequence Lexical Normalization with Multilingual Transformers (Bucur et al., WNUT 2021)
PDF:
https://aclanthology.org/2021.wnut-1.53.pdf
Data
OLID