DeepStruct: Pretraining of Language Models for Structure Prediction

Chenguang Wang, Xiao Liu, Zui Chen, Haoyun Hong, Jie Tang, Dawn Song


Abstract
We introduce a method for improving the structural understanding abilities of language models. Unlike previous approaches that finetune the models with task-specific augmentation, we pretrain language models to generate structures from text on a collection of task-agnostic corpora. Our structure pretraining enables zero-shot transfer of the models' learned knowledge about structure tasks. We study the performance of this approach on 28 datasets, spanning 10 structure prediction tasks including open information extraction, joint entity and relation extraction, named entity recognition, relation classification, semantic role labeling, event extraction, coreference resolution, factual probe, intent detection, and dialogue state tracking. We further enhance the pretraining with the task-specific training sets. We show that a 10B parameter language model transfers non-trivially to most tasks and obtains state-of-the-art performance on 21 of 28 datasets that we evaluate. Our code and datasets will be made publicly available.
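The abstract's core idea is to cast structure prediction as text-to-structure generation: the model reads a sentence and emits a serialized set of triples. As a minimal sketch of this framing, the snippet below parses a model's generated string into (head, relation, tail) triples; the "(head; relation; tail)" serialization format and the `parse_triples` helper are illustrative assumptions, not the paper's exact scheme.

```python
# Illustrative sketch of the text-to-structure framing: a language model's
# generated output is a string of serialized triples, which we parse back
# into structured (head, relation, tail) tuples. The "(h; r; t)" format
# here is an assumption for demonstration, not DeepStruct's exact format.

def parse_triples(generated: str) -> list[tuple[str, str, str]]:
    """Parse a generated string like "(h; r; t) (h; r; t)" into triples."""
    triples = []
    for chunk in generated.split(")"):
        chunk = chunk.strip().lstrip("(").strip()
        if not chunk:
            continue
        parts = [p.strip() for p in chunk.split(";")]
        if len(parts) == 3:
            triples.append((parts[0], parts[1], parts[2]))
    return triples

# Example: a joint entity and relation extraction output for the sentence
# "Dublin is the capital of Ireland."
output = "(Dublin; instance of; city) (Dublin; capital of; Ireland)"
print(parse_triples(output))
```

In this framing, every task in the abstract's list (NER, relation classification, event extraction, and so on) shares one output space of triples, which is what makes task-agnostic pretraining and zero-shot transfer possible.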
Anthology ID:
2022.findings-acl.67
Original:
2022.findings-acl.67v1
Version 2:
2022.findings-acl.67v2
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
803–823
URL:
https://aclanthology.org/2022.findings-acl.67
DOI:
10.18653/v1/2022.findings-acl.67
Cite (ACL):
Chenguang Wang, Xiao Liu, Zui Chen, Haoyun Hong, Jie Tang, and Dawn Song. 2022. DeepStruct: Pretraining of Language Models for Structure Prediction. In Findings of the Association for Computational Linguistics: ACL 2022, pages 803–823, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
DeepStruct: Pretraining of Language Models for Structure Prediction (Wang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.67.pdf
Code
 cgraywang/deepstruct
Data
ACE 2005, ATIS, CoNLL, CoNLL 2003, CoNLL++, CoNLL04, FewRel, GENIA, KELM, MultiWOZ, New York Times Annotated Corpus, OIE2016, OPIEC, OntoNotes 5.0, Penn Treebank, SNIPS, T-REx, TACRED, TekGen