Word-Level Loss Extensions for Neural Temporal Relation Classification

Artuur Leeuwenberg, Marie-Francine Moens


Abstract
Unsupervised pre-trained word embeddings are used effectively for many tasks in natural language processing to leverage unlabeled textual data. These embeddings are often used either as initializations or as fixed word representations for task-specific classification models. In this work, we extend our classification model's task loss with an unsupervised auxiliary loss at the word-embedding level of the model. This encourages the learned word representations to contain both task-specific features, learned from the supervised loss component, and more general features, learned from the unsupervised loss component. We evaluate our approach on the task of temporal relation extraction, in particular, narrative containment relation extraction from clinical records, and show that continued training of the embeddings on the unsupervised objective together with the task objective yields better task-specific embeddings and improves over the state of the art on the THYME dataset, using only a general-domain part-of-speech tagger as a linguistic resource.
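The core idea of the abstract, a single embedding table updated jointly by a supervised task loss and an unsupervised word-level loss, can be sketched roughly as below. This is a minimal illustration, not the authors' actual model: the LSTM encoder, the skip-gram-with-negative-sampling auxiliary objective, and all names, dimensions, and the weighting factor are assumptions made for the example; the paper itself specifies the real architecture and word-level loss extensions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class JointLossModel(nn.Module):
    """Relation classifier whose embedding layer is also trained on an
    unsupervised word-level objective (here, a skip-gram-style loss)."""

    def __init__(self, vocab_size, emb_dim=50, hidden_dim=100, n_relations=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)    # shared table
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, n_relations)  # task head
        self.context = nn.Embedding(vocab_size, emb_dim)      # aux. output vectors

    def task_loss(self, token_ids, labels):
        # Supervised component: classify the temporal relation expressed
        # by a window of tokens around the candidate entity pair.
        emb = self.embedding(token_ids)            # (batch, seq, emb_dim)
        _, (h, _) = self.encoder(emb)              # final hidden state
        logits = self.classifier(h[-1])            # (batch, n_relations)
        return F.cross_entropy(logits, labels)

    def aux_loss(self, centers, contexts, negatives):
        # Unsupervised component: skip-gram with negative sampling over the
        # SAME embedding table, preserving general distributional features.
        c = self.embedding(centers)                # (batch, emb_dim)
        pos = self.context(contexts)               # (batch, emb_dim)
        neg = self.context(negatives)              # (batch, k, emb_dim)
        pos_score = F.logsigmoid((c * pos).sum(-1))
        neg_score = F.logsigmoid(-(neg @ c.unsqueeze(-1)).squeeze(-1)).sum(-1)
        return -(pos_score + neg_score).mean()

# Smoke test with random data (shapes only; not real THYME input).
model = JointLossModel(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 12))
labels = torch.randint(0, 2, (4,))
centers = torch.randint(0, 1000, (4,))
contexts = torch.randint(0, 1000, (4,))
negatives = torch.randint(0, 1000, (4, 5))
loss = model.task_loss(tokens, labels) + 0.5 * model.aux_loss(centers, contexts, negatives)
loss.backward()

Because both losses backpropagate into the same embedding matrix, each optimizer step updates the word vectors with a mix of task-specific and general distributional gradients; the weight on the auxiliary term (0.5 here, purely illustrative) controls that trade-off.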
Anthology ID:
C18-1291
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3436–3447
URL:
https://aclanthology.org/C18-1291
Cite (ACL):
Artuur Leeuwenberg and Marie-Francine Moens. 2018. Word-Level Loss Extensions for Neural Temporal Relation Classification. In Proceedings of the 27th International Conference on Computational Linguistics, pages 3436–3447, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Word-Level Loss Extensions for Neural Temporal Relation Classification (Leeuwenberg & Moens, COLING 2018)
PDF:
https://aclanthology.org/C18-1291.pdf
Code:
tuur/WLLETlinkClassification