%0 Journal Article
%T Adaptive Semiparametric Language Models
%A Yogatama, Dani
%A de Masson d’Autume, Cyprien
%A Kong, Lingpeng
%J Transactions of the Association for Computational Linguistics
%D 2021
%V 9
%I MIT Press
%C Cambridge, MA
%F yogatama-etal-2021-adaptive
%X We present a language model that combines a large parametric neural network (i.e., a transformer) with a non-parametric episodic memory component in an integrated architecture. Our model uses extended short-term context by caching local hidden states (similar to transformer-XL) and global long-term memory by retrieving a set of nearest neighbor tokens at each timestep. We design a gating function to adaptively combine multiple information sources to make a prediction. This mechanism allows the model to use either local context, short-term memory, or long-term memory (or any combination of them) on an ad hoc basis depending on the context. Experiments on word-based and character-based language modeling datasets demonstrate the efficacy of our proposed method compared to strong baselines.
%R 10.1162/tacl_a_00371
%U https://aclanthology.org/2021.tacl-1.22
%U https://doi.org/10.1162/tacl_a_00371
%P 362-373