Distributionally Robust Recurrent Decoders with Random Network Distillation

Antonio Valerio Miceli Barone, Alexandra Birch, Rico Sennrich

Abstract
Neural machine learning models can successfully model language that is similar to their training distribution, but they are highly susceptible to degradation under distribution shift, which occurs in many practical applications when processing out-of-domain (OOD) text. This has been attributed to “shortcut learning”: relying on weak correlations over arbitrarily large contexts. We propose a method based on OOD detection with Random Network Distillation that allows an autoregressive language model to automatically disregard OOD context during inference, smoothly transitioning towards a less expressive but more robust model as the data becomes more OOD, while retaining its full context capability when operating in-distribution. We apply our method to a GRU architecture, demonstrating improvements on multiple language modeling (LM) datasets.
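
Below is a minimal PyTorch sketch of the general idea described in the abstract: a Random Network Distillation (RND) novelty score gates how much recurrent context a GRU decoder retains. This is an illustrative assumption-based sketch, not the paper's exact architecture; the module names (RNDScore, GatedGRUDecoder), layer sizes, and the exponential gating function are all hypothetical choices made here for clarity.

import torch
import torch.nn as nn

class RNDScore(nn.Module):
    """Novelty score: prediction error against a frozen, randomly initialized target network.
    The predictor is trained to match the target on in-domain data, so a large error
    at test time suggests the input is out-of-distribution."""
    def __init__(self, d_in, d_out=64):
        super().__init__()
        self.target = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU(), nn.Linear(d_out, d_out))
        for p in self.target.parameters():
            p.requires_grad_(False)  # target network stays fixed
        self.predictor = nn.Sequential(nn.Linear(d_in, d_out), nn.ReLU(), nn.Linear(d_out, d_out))

    def forward(self, x):
        # Mean squared prediction error per example.
        return (self.predictor(x) - self.target(x)).pow(2).mean(dim=-1)

class GatedGRUDecoder(nn.Module):
    """GRU language model whose hidden state is attenuated when the RND score flags OOD context."""
    def __init__(self, vocab_size, d_emb=256, d_hid=512):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, d_emb)
        self.gru = nn.GRUCell(d_emb, d_hid)
        self.out = nn.Linear(d_hid, vocab_size)
        self.rnd = RNDScore(d_hid)
        self.scale = nn.Parameter(torch.tensor(1.0))  # calibrates novelty score -> gate strength

    def forward(self, tokens):
        # tokens: LongTensor of shape (batch, time)
        h = tokens.new_zeros(tokens.size(0), self.gru.hidden_size, dtype=torch.float)
        logits = []
        for t in range(tokens.size(1)):
            h = self.gru(self.emb(tokens[:, t]), h)
            gate = torch.exp(-self.scale * self.rnd(h)).unsqueeze(-1)  # in (0, 1]
            h = gate * h  # smoothly discard context judged OOD
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)

Under these assumptions, training would combine the usual LM cross-entropy with the RND predictor's error on in-domain hidden states; at inference a large prediction error shrinks the gate, so the decoder falls back towards a shorter-context, more robust model, while in-distribution input keeps the gate near 1 and the full context capability intact.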
Anthology ID:
2022.repl4nlp-1.1
Volume:
Proceedings of the 7th Workshop on Representation Learning for NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Spandana Gella, He He, Bodhisattwa Prasad Majumder, Burcu Can, Eleonora Giunchiglia, Samuel Cahyawijaya, Sewon Min, Maximilian Mozes, Xiang Lorraine Li, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Laura Rimell, Chris Dyer
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
1–8
URL:
https://aclanthology.org/2022.repl4nlp-1.1
DOI:
10.18653/v1/2022.repl4nlp-1.1
Cite (ACL):
Antonio Valerio Miceli Barone, Alexandra Birch, and Rico Sennrich. 2022. Distributionally Robust Recurrent Decoders with Random Network Distillation. In Proceedings of the 7th Workshop on Representation Learning for NLP, pages 1–8, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Distributionally Robust Recurrent Decoders with Random Network Distillation (Valerio Miceli Barone et al., RepL4NLP 2022)
PDF:
https://aclanthology.org/2022.repl4nlp-1.1.pdf
Video:
https://aclanthology.org/2022.repl4nlp-1.1.mp4
Data
MTNT