What Don’t RNN Language Models Learn About Filler-Gap Dependencies?

Rui Chaves


Anthology ID:
2020.scil-1.1
Volume:
Proceedings of the Society for Computation in Linguistics 2020
Month:
January
Year:
2020
Address:
New York, New York
Editors:
Allyson Ettinger, Gaja Jarosz, Joe Pater
Venue:
SCiL
Publisher:
Association for Computational Linguistics
Pages:
1–11
URL:
https://aclanthology.org/2020.scil-1.1
Cite (ACL):
Rui Chaves. 2020. What Don’t RNN Language Models Learn About Filler-Gap Dependencies?. In Proceedings of the Society for Computation in Linguistics 2020, pages 1–11, New York, New York. Association for Computational Linguistics.
Cite (Informal):
What Don’t RNN Language Models Learn About Filler-Gap Dependencies? (Chaves, SCiL 2020)
PDF:
https://aclanthology.org/2020.scil-1.1.pdf