%0 Conference Proceedings
%T Sequential Neural Networks as Automata
%A Merrill, William
%Y Eisner, Jason
%Y Gallé, Matthias
%Y Heinz, Jeffrey
%Y Quattoni, Ariadna
%Y Rabusseau, Guillaume
%S Proceedings of the Workshop on Deep Learning and Formal Languages: Building Bridges
%D 2019
%8 August
%I Association for Computational Linguistics
%C Florence
%F merrill-2019-sequential
%X This work attempts to explain the types of computation that neural networks can perform by relating them to automata. We first define what it means for a real-time network with bounded precision to accept a language. A measure of network memory follows from this definition. We then characterize the classes of languages acceptable by various recurrent networks, attention, and convolutional networks. We find that LSTMs function like counter machines and relate convolutional networks to the subregular hierarchy. Overall, this work attempts to increase our understanding and ability to interpret neural networks through the lens of theory. These theoretical insights help explain neural computation, as well as the relationship between neural networks and natural language grammar.
%R 10.18653/v1/W19-3901
%U https://aclanthology.org/W19-3901
%U https://doi.org/10.18653/v1/W19-3901
%P 1-13