Effective Use of Transformer Networks for Entity Tracking

Aditya Gupta, Greg Durrett


Abstract
Tracking entities in procedural language requires understanding the transformations arising from actions on entities as well as those entities' interactions. While self-attention-based pre-trained language encoders like GPT and BERT have been successfully applied across a range of natural language understanding tasks, their ability to handle the nuances of procedural texts is still unknown. In this paper, we explore the use of pre-trained transformer networks for entity tracking tasks in procedural text. First, we test standard lightweight approaches for prediction with pre-trained transformers, and find that these approaches underperform even simple baselines. We show that much stronger results can be attained by restructuring the input to guide the model to focus on a particular entity. Second, we assess the degree to which the transformer networks capture the process dynamics, investigating such factors as merged entities and oblique entity references. On two different tasks, ingredient detection in recipes and QA over scientific processes, we achieve state-of-the-art results, but our models still largely attend to shallow context clues and do not form complex representations of intermediate process state.
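The abstract's key idea is conditioning the transformer's input on the queried entity rather than predicting for all entities at once. Below is a minimal sketch of that idea using the Hugging Face transformers API with a BERT classifier; the pair-encoding template ("entity [SEP] process text") and the binary label set are illustrative assumptions, not necessarily the authors' exact format (see the linked repository for their implementation).

```python
# Sketch of entity-conditioned input restructuring for a BERT-style encoder.
# Assumption: the paper's idea is approximated by encoding the entity as the
# first segment and the process text as the second, so self-attention is
# guided toward the queried entity.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g., ingredient present vs. absent
)

def entity_conditioned_inputs(entity: str, process_text: str):
    # Pair encoding produces "[CLS] entity [SEP] process_text [SEP]".
    return tokenizer(entity, process_text, return_tensors="pt", truncation=True)

inputs = entity_conditioned_inputs(
    "butter",
    "Melt the butter in a pan. Add the flour and whisk until combined.",
)
with torch.no_grad():
    logits = model(**inputs).logits  # one prediction per queried entity
```

In this setup the model is run once per (entity, step) pair, trading extra forward passes for a representation that is explicitly anchored to a single entity.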
Anthology ID:
D19-1070
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
759–769
URL:
https://aclanthology.org/D19-1070
DOI:
10.18653/v1/D19-1070
Cite (ACL):
Aditya Gupta and Greg Durrett. 2019. Effective Use of Transformer Networks for Entity Tracking. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 759–769, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Effective Use of Transformer Networks for Entity Tracking (Gupta & Durrett, EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1070.pdf
Code:
aditya2211/transformer-entity-tracking