What Don’t RNN Language Models Learn About Filler-Gap Dependencies?
Rui Chaves
Proceedings of the Society for Computation in Linguistics 2020
Assessing the Ability of Transformer-based Neural Models to Represent Structurally Unbounded Dependencies
Jillian Da Costa | Rui Chaves
Proceedings of the Society for Computation in Linguistics 2020