Applying the Transformer to Character-level Transduction

Shijie Wu, Ryan Cotterell, Mans Hulden


Abstract
The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks. Yet for character-level transduction tasks, e.g., morphological inflection generation and historical text normalization, few transformer-based approaches outperform recurrent models. In an empirical study, we uncover that, in contrast to recurrent sequence-to-sequence models, the batch size plays a crucial role in the performance of the transformer on character-level tasks, and we show that with a large enough batch size the transformer does indeed outperform recurrent models. We also introduce a simple technique to handle feature-guided character-level transduction that further improves performance. With these insights, we achieve state-of-the-art performance on morphological inflection and historical text normalization. We also show that the transformer outperforms a strong baseline on two other character-level transduction tasks: grapheme-to-phoneme conversion and transliteration.
Anthology ID:
2021.eacl-main.163
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1901–1907
URL:
https://aclanthology.org/2021.eacl-main.163
DOI:
10.18653/v1/2021.eacl-main.163
Award:
 Honorable Mention for Best Short Paper
Cite (ACL):
Shijie Wu, Ryan Cotterell, and Mans Hulden. 2021. Applying the Transformer to Character-level Transduction. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1901–1907, Online. Association for Computational Linguistics.
Cite (Informal):
Applying the Transformer to Character-level Transduction (Wu et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.163.pdf
Code
shijie-wu/neural-transducer (plus additional community code)