%0 Journal Article
%T Neural Lattice Language Models
%A Buckman, Jacob
%A Neubig, Graham
%J Transactions of the Association for Computational Linguistics
%D 2018
%V 6
%I MIT Press
%C Cambridge, MA
%F buckman-neubig-2018-neural
%X In this work, we propose a new language modeling paradigm that has the ability to perform both prediction and moderation of information flow at multiple granularities: neural lattice language models. These models construct a lattice of possible paths through a sentence and marginalize across this lattice to calculate sequence probabilities or optimize parameters. This approach allows us to seamlessly incorporate linguistic intuitions, including polysemy and the existence of multi-word lexical items, into our language model. Experiments on multiple language modeling tasks show that English neural lattice language models that utilize polysemous embeddings are able to improve perplexity by 9.95% relative to a word-level baseline, and that a Chinese model that handles multi-character tokens is able to improve perplexity by 20.94% relative to a character-level baseline.
%R 10.1162/tacl_a_00036
%U https://aclanthology.org/Q18-1036
%U https://doi.org/10.1162/tacl_a_00036
%P 529-541