Learning PyTorch Through A Neural Dependency Parsing Exercise

David Jurgens


Abstract
Dependency parsing is an increasingly popular parsing formalism in practice. This assignment provides a practice exercise in implementing the shift-reduce dependency parser of Chen and Manning (2014). The parser is a two-layer feed-forward neural network that students implement in PyTorch, giving them hands-on practice in developing deep learning models and exposure to building parsers.
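
To make the described architecture concrete, the following is a minimal PyTorch sketch of a Chen-and-Manning-style transition classifier. The feature count, embedding size, hidden size, and the restriction to three unlabeled transitions are illustrative assumptions rather than values from the assignment handout; the cubic activation follows the original paper.

import torch
import torch.nn as nn

class ParserModel(nn.Module):
    # Two-layer feed-forward transition classifier in the style of
    # Chen and Manning (2014). All hyperparameters here are illustrative
    # assumptions, not values from the assignment handout.
    def __init__(self, vocab_size, n_features=48, embed_dim=50,
                 hidden_dim=200, n_transitions=3):
        super().__init__()
        # A single embedding table covering word, POS-tag, and arc-label ids.
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(n_features * embed_dim, hidden_dim)
        self.output = nn.Linear(hidden_dim, n_transitions)

    def forward(self, feature_ids):
        # feature_ids: (batch, n_features) ids extracted from the current
        # stack/buffer configuration of the shift-reduce parser.
        embedded = self.embeddings(feature_ids)     # (batch, n_features, embed_dim)
        flat = embedded.view(embedded.size(0), -1)  # concatenate all feature embeddings
        hidden = torch.pow(self.hidden(flat), 3)    # cubic activation, as in the paper
        return self.output(hidden)                  # scores over SHIFT / LEFT-ARC / RIGHT-ARC

# Example: score the transitions for a batch of two parser configurations.
model = ParserModel(vocab_size=10000)
features = torch.randint(0, 10000, (2, 48))
scores = model(features)  # shape: (2, 3)

At parse time, the highest-scoring legal transition is applied to the configuration, features are re-extracted, and the process repeats until the buffer is empty and a complete tree has been built.
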
Anthology ID: 2021.teachingnlp-1.10
Volume: Proceedings of the Fifth Workshop on Teaching NLP
Month: June
Year: 2021
Address: Online
Editors: David Jurgens, Varada Kolhatkar, Lucy Li, Margot Mieskes, Ted Pedersen
Venue: TeachingNLP
Publisher: Association for Computational Linguistics
Pages: 62–64
URL: https://aclanthology.org/2021.teachingnlp-1.10
DOI: 10.18653/v1/2021.teachingnlp-1.10
Cite (ACL): David Jurgens. 2021. Learning PyTorch Through A Neural Dependency Parsing Exercise. In Proceedings of the Fifth Workshop on Teaching NLP, pages 62–64, Online. Association for Computational Linguistics.
Cite (Informal): Learning PyTorch Through A Neural Dependency Parsing Exercise (Jurgens, TeachingNLP 2021)
PDF: https://aclanthology.org/2021.teachingnlp-1.10.pdf