Efficient Data Learning for Open Information Extraction with Pre-trained Language Models

Zhiyuan Fan, Shizhu He


Abstract
Open Information Extraction (OpenIE) is a fundamental yet challenging task in Natural Language Processing, which involves extracting all triples (subject, predicate, object) from a given sentence. While labelling-based methods have their merits, generation-based techniques offer unique advantages, such as the ability to generate tokens not present in the original sentence. However, these generation-based methods often require a significant amount of training data to learn the task form of OpenIE and substantial training time to overcome slow model convergence due to the order penalty. In this paper, we introduce a novel framework, OK-IE, that ingeniously transforms the task form of OpenIE into the pre-training task form of the T5 model, thereby reducing the need for extensive training data. Furthermore, we introduce an innovative concept of ‘anchors’ to control the sequence of model outputs, effectively eliminating the impact of order penalty on model convergence and significantly reducing training time. Experimental results indicate that, compared to previous SOTA methods, OK-IE requires only 1/100 of the training data (900 instances) and 1/120 of the training time (3 minutes) to achieve comparable results.
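As an illustration of the idea described in the abstract, the sketch below shows how an OpenIE instance can be cast into T5's pre-training (span-fill) format, with sentinel tokens acting as the "anchors" that fix which slot each extracted element fills. This is a minimal, hypothetical example using Hugging Face Transformers, not the authors' released code; the checkpoint name, the prompt layout, and the mapping of anchors to sentinel tokens are assumptions for illustration only.

```python
# Hedged sketch (not the paper's implementation): frame one OpenIE example as a
# T5 span-fill problem, where sentinel tokens anchor the subject/predicate/object
# slots so the output order is fixed rather than freely generated.
from transformers import T5TokenizerFast, T5ForConditionalGeneration

model_name = "t5-base"  # assumed checkpoint; the paper may use a different one
tokenizer = T5TokenizerFast.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

sentence = "Barack Obama was born in Hawaii."
# Input mirrors T5 pre-training: sentinel tokens mark the slots to be filled.
source = f"{sentence} <extra_id_0> <extra_id_1> <extra_id_2>"
# Target fills each sentinel in a fixed order: subject, predicate, object.
target = "<extra_id_0> Barack Obama <extra_id_1> was born in <extra_id_2> Hawaii"

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # one fine-tuning step on this example
```

Because the target reuses T5's own sentinel-token format, the model only has to learn what to place in each slot, not a new task format or output ordering, which is the intuition behind the data and time savings reported in the abstract.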
Anthology ID: 2023.findings-emnlp.869
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 13056–13063
URL: https://aclanthology.org/2023.findings-emnlp.869
DOI: 10.18653/v1/2023.findings-emnlp.869
Cite (ACL): Zhiyuan Fan and Shizhu He. 2023. Efficient Data Learning for Open Information Extraction with Pre-trained Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13056–13063, Singapore. Association for Computational Linguistics.
Cite (Informal): Efficient Data Learning for Open Information Extraction with Pre-trained Language Models (Fan & He, Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.869.pdf