Lexicon Learning for Few-Shot Sequence Modeling

Ekin Akyurek, Jacob Andreas


Abstract
Sequence-to-sequence transduction is the core problem in language processing applications as diverse as semantic parsing, machine translation, and instruction following. The neural network models that provide the dominant solution to these problems are brittle, especially in low-resource settings: they fail to generalize correctly or systematically from small datasets. Past work has shown that many failures of systematic generalization arise from neural models’ inability to disentangle lexical phenomena from syntactic ones. To address this, we augment neural decoders with a lexical translation mechanism that generalizes existing copy mechanisms to incorporate learned, decontextualized, token-level translation rules. We describe how to initialize this mechanism using a variety of lexicon learning algorithms, and show that it improves systematic generalization on a diverse set of sequence modeling tasks drawn from cognitive science, formal semantics, and machine translation.
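To make the abstract's mechanism concrete, here is a minimal PyTorch sketch of one such decoder step (this is not the authors' released implementation; all function and tensor names, and the count-based lexicon initializer, are illustrative assumptions). The next-token distribution is a gated mixture of three modes: ordinary generation over the target vocabulary, a copy mechanism that routes attention mass to source tokens, and a lexical-translation term that routes the same attention mass through fixed, decontextualized token-level rules p(y | x).

```python
import torch
import torch.nn.functional as F

def init_lexicon(pairs, v_src, v_tgt, smooth=1e-6):
    """Row-stochastic lexicon p(y | x) from co-occurrence counts.

    A deliberately simple stand-in for the lexicon-learning algorithms
    the paper compares. `pairs` is an iterable of (src_ids, tgt_ids)
    integer lists over the source/target vocabularies.
    """
    counts = torch.full((v_src, v_tgt), smooth)
    for src, tgt in pairs:
        for x in src:
            for y in tgt:
                counts[x, y] += 1.0
    return counts / counts.sum(dim=1, keepdim=True)

def decoder_step(dec_state, enc_states, src_ids, lexicon, out_proj, gate_proj):
    """One decoding step mixing generation, copying, and lexical translation.

    dec_state:  [B, H]    current decoder hidden state
    enc_states: [B, S, H] encoder states for the S source tokens
    src_ids:    [B, S]    source token ids (assumed to index a vocabulary
                          shared with the target side, so copying is valid)
    lexicon:    [Vs, Vt]  fixed token-level translation table p(y | x)
    out_proj:   linear map H -> Vt (vocabulary projection)
    gate_proj:  linear map H -> 3 (mixture weights over the three modes)
    """
    # Attention over source positions, driven by the decoder state.
    attn = F.softmax(torch.einsum("bh,bsh->bs", dec_state, enc_states), dim=-1)

    # (1) Ordinary generation from the target vocabulary.
    p_gen = F.softmax(out_proj(dec_state), dim=-1)                 # [B, Vt]

    # (2) Copying: scatter attention mass onto the attended source tokens.
    p_copy = torch.zeros_like(p_gen).scatter_add(1, src_ids, attn)

    # (3) Lexical translation: route the same attention mass through the
    #     decontextualized per-token rules p(y | x).
    p_lex = torch.einsum("bs,bsv->bv", attn, lexicon[src_ids])     # [B, Vt]

    # Gated convex combination of the three distributions.
    g = F.softmax(gate_proj(dec_state), dim=-1)                    # [B, 3]
    return g[:, 0:1] * p_gen + g[:, 1:2] * p_copy + g[:, 2:3] * p_lex
```

The released code linked below (ekinakyurek/lexical) is the authoritative implementation; the paper additionally compares several lexicon-learning algorithms for initializing the translation table, of which the count-based initializer above is only a crude analogue.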
Anthology ID:
2021.acl-long.382
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
4934–4946
URL:
https://aclanthology.org/2021.acl-long.382
DOI:
10.18653/v1/2021.acl-long.382
Cite (ACL):
Ekin Akyurek and Jacob Andreas. 2021. Lexicon Learning for Few-Shot Sequence Modeling. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4934–4946, Online. Association for Computational Linguistics.
Cite (Informal):
Lexicon Learning for Few-Shot Sequence Modeling (Akyurek & Andreas, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.382.pdf
Video:
https://aclanthology.org/2021.acl-long.382.mp4
Code:
ekinakyurek/lexical
Data:
SCAN
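As a toy illustration of the SCAN setting (commands like "jump twice" map to action sequences like "JUMP JUMP"; real SCAN uses action tokens such as I_JUMP), the count-based initializer sketched above already recovers the obvious word-level correspondences. The vocabularies and indices here are made up for the example:

```python
# Toy SCAN-style vocabularies; real SCAN action tokens look like I_JUMP.
SRC = {"jump": 0, "walk": 1, "twice": 2}
TGT = {"JUMP": 0, "WALK": 1}
pairs = [
    ([SRC["jump"]], [TGT["JUMP"]]),
    ([SRC["walk"]], [TGT["WALK"]]),
    ([SRC["jump"], SRC["twice"]], [TGT["JUMP"], TGT["JUMP"]]),
]
lex = init_lexicon(pairs, v_src=3, v_tgt=2)  # from the sketch above
print(lex[SRC["jump"]])  # nearly all probability mass on JUMP
```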