Retrofitting Structure-aware Transformer Language Model for End Tasks

Hao Fei, Yafeng Ren, Donghong Ji


Abstract
We consider retrofitting a structure-aware Transformer language model to facilitate end tasks, proposing to exploit syntactic distance to encode both phrasal constituency and dependency connections into the language model. A middle-layer structural learning strategy is leveraged for structure integration, carried out jointly with the main semantic task under a multi-task learning scheme. Experimental results show that the retrofitted structure-aware Transformer language model achieves improved perplexity while inducing accurate syntactic phrases. With structure-aware fine-tuning, our model achieves significant improvements on both semantic- and syntactic-dependent tasks.
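The abstract describes a multi-task setup in which a middle layer of the Transformer carries a structural (syntactic-distance) learning signal while the model is also trained on its main objective. As a rough illustration only, the hypothetical PyTorch sketch below shows one way such a joint objective could be wired up; the class, the per-token distance head, and the loss weight alpha are assumptions for exposition and do not reproduce the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructureAwareLM(nn.Module):
    """Hypothetical sketch: a Transformer LM whose middle layer also predicts
    per-token syntactic distances (causal masking omitted for brevity)."""
    def __init__(self, vocab_size, d_model=256, n_heads=4, n_lower=2, n_upper=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        make_layer = lambda: nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.lower = nn.TransformerEncoder(make_layer(), n_lower)  # layers below the structural head
        self.upper = nn.TransformerEncoder(make_layer(), n_upper)  # layers above it
        self.dist_head = nn.Linear(d_model, 1)       # middle-layer syntactic-distance predictor
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        h = self.lower(self.embed(tokens))            # (batch, seq, d_model)
        pred_dist = self.dist_head(h).squeeze(-1)     # (batch, seq): structural signal
        logits = self.lm_head(self.upper(h))          # (batch, seq, vocab)
        return logits, pred_dist

def joint_loss(logits, pred_dist, next_tokens, gold_dist, alpha=0.5):
    """Multi-task objective: language modeling plus syntactic-distance regression."""
    lm = F.cross_entropy(logits.reshape(-1, logits.size(-1)), next_tokens.reshape(-1))
    struct = F.mse_loss(pred_dist, gold_dist)
    return lm + alpha * struct
```

In this sketch the structural loss is attached after the lower block of layers rather than at the output, mirroring the middle-layer integration idea; the weighting between the two losses is a tunable assumption.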
Anthology ID:
2020.emnlp-main.168
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2151–2161
URL:
https://aclanthology.org/2020.emnlp-main.168
DOI:
10.18653/v1/2020.emnlp-main.168
Cite (ACL):
Hao Fei, Yafeng Ren, and Donghong Ji. 2020. Retrofitting Structure-aware Transformer Language Model for End Tasks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2151–2161, Online. Association for Computational Linguistics.
Cite (Informal):
Retrofitting Structure-aware Transformer Language Model for End Tasks (Fei et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.168.pdf
Video:
https://slideslive.com/38938663
Data:
SST