Constructive Type-Logical Supertagging With Self-Attention Networks

Konstantinos Kogkalidis, Michael Moortgat, Tejaswini Deoskar


Abstract
We propose a novel application of self-attention networks to grammar induction. We present an attention-based supertagger for a refined type-logical grammar, trained to construct types inductively. In addition to achieving high overall type accuracy, our model learns the syntax of the grammar's type system along with its denotational semantics. This lifts the closed-world assumption commonly made by lexicalized grammar supertaggers, greatly enhancing the model's generalization potential. This is evidenced both by its adequate accuracy on sparse word types and by its ability to correctly construct complex types never seen during training, which, to the best of our knowledge, had not been accomplished before.
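To make the constructive idea concrete, here is a minimal sketch in Python (not the paper's implementation; the atomic types, symbol vocabulary, and decoder interface are hypothetical stand-ins): rather than classifying each word against a closed set of supertags, the tagger emits a type one symbol at a time, so complex types absent from the training tagset can still be assembled at decode time.

    # Minimal sketch of constructive supertagging (hypothetical, not the
    # authors' code): types are sequences over a small symbol inventory,
    # generated left to right instead of picked from a closed tagset.
    ATOMS = {"np", "s", "n", "pp"}   # hypothetical atomic types
    CONNECTIVES = {"/", "\\"}        # directional implications
    SEP = "<sep>"                    # ends one word's type
    EOS = "<eos>"                    # ends the whole sequence
    VOCAB = sorted(ATOMS | CONNECTIVES | {SEP, EOS})  # decoder's output space

    def decode_types(step_fn, max_len=100):
        """Greedily emit type symbols until EOS, splitting on SEP into one
        constructed type per word. `step_fn(prefix)` stands in for the
        attention decoder's next-symbol prediction given the prefix."""
        prefix, types, current = [], [], []
        for _ in range(max_len):
            sym = step_fn(prefix)
            prefix.append(sym)
            if sym == EOS:
                break
            if sym == SEP:
                types.append(tuple(current))
                current = []
            else:
                current.append(sym)
        return types

    # Toy stand-in decoder that "constructs" the transitive-verb type
    # np\s/np for word 1 and np for word 2; a real model would condition
    # on encoder states over the input sentence.
    script = ["np", "\\", "s", "/", "np", SEP, "np", SEP, EOS]
    print(decode_types(lambda prefix: script[len(prefix)]))
    # -> [('np', '\\', 's', '/', 'np'), ('np',)]

Because the output space is the symbol vocabulary rather than a fixed tag inventory, the decoder can in principle produce any well-formed type, which is what allows the model to handle sparse and unseen types.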
Anthology ID:
W19-4314
Volume:
Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019)
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Isabelle Augenstein, Spandana Gella, Sebastian Ruder, Katharina Kann, Burcu Can, Johannes Welbl, Alexis Conneau, Xiang Ren, Marek Rei
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
113–123
URL:
https://aclanthology.org/W19-4314
DOI:
10.18653/v1/W19-4314
Cite (ACL):
Konstantinos Kogkalidis, Michael Moortgat, and Tejaswini Deoskar. 2019. Constructive Type-Logical Supertagging With Self-Attention Networks. In Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), pages 113–123, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Constructive Type-Logical Supertagging With Self-Attention Networks (Kogkalidis et al., RepL4NLP 2019)
PDF:
https://aclanthology.org/W19-4314.pdf
Code:
konstantinosKokos/Lassy-TLG-Supertagging
Data:
aethel