An Empirical Study of Pipeline vs. Joint approaches to Entity and Relation Extraction

Zhaohui Yan, Zixia Jia, Kewei Tu


Abstract
The Entity and Relation Extraction (ERE) task includes two basic sub-tasks: Named Entity Recognition and Relation Extraction. In the last several years, much work has focused on joint approaches, owing to the common perception that the pipeline approach suffers from the error propagation problem. Recent work reconsiders the pipeline scheme and shows that it can produce comparable results. To systematically study the pros and cons of these two schemes, we design and test eight pipeline and joint approaches to the ERE task. We find that with the same span representation methods, the best joint approach still outperforms the best pipeline model, but improperly designed joint approaches may perform poorly. We hope our work sheds some light on the pipeline-vs-joint debate of the ERE task and inspires further research.
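To make the pipeline-vs-joint contrast concrete, below is a minimal toy sketch in Python. This is not code from the paper: the function names, the "REL"/"ENT" labels, and the capitalization heuristic standing in for a trained tagger are all hypothetical. The architectural point it illustrates is the one in the abstract: in the pipeline scheme the relation stage only sees entities predicted by the NER stage (so NER errors propagate), while in the joint scheme entity and relation decisions are made together over the same candidate spans.

from itertools import combinations

SENT = ["Alice", "works", "at", "Acme"]

def toy_ner(tokens):
    # toy stand-in for a trained entity tagger:
    # pretend capitalized tokens are single-token entities
    return [(i, i, "ENT") for i, t in enumerate(tokens) if t[0].isupper()]

def toy_re(entities):
    # toy stand-in for a relation classifier over entity pairs
    return [(e1, e2, "REL") for e1, e2 in combinations(entities, 2)]

def pipeline_ere(tokens):
    ents = toy_ner(tokens)   # stage 1: NER
    rels = toy_re(ents)      # stage 2: RE sees only stage-1 predictions,
                             # so any NER error propagates into RE
    return ents, rels

def joint_ere(tokens):
    # one shared pass over all candidate spans; entity and relation
    # decisions are drawn from the same span representations
    spans = [(i, j) for i in range(len(tokens)) for j in range(i, len(tokens))]
    ents = [(i, j, "ENT") for (i, j) in spans
            if i == j and tokens[i][0].isupper()]
    rels = toy_re(ents)
    return ents, rels

print(pipeline_ere(SENT))
print(joint_ere(SENT))

On this toy input the two schemes produce identical output; the difference is architectural, and the paper's contribution is an empirical comparison of eight such designs under matched span representations.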
Anthology ID:
2022.aacl-short.55
Volume:
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
November
Year:
2022
Address:
Online only
Editors:
Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venues:
AACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
437–443
URL:
https://aclanthology.org/2022.aacl-short.55
Cite (ACL):
Zhaohui Yan, Zixia Jia, and Kewei Tu. 2022. An Empirical Study of Pipeline vs. Joint approaches to Entity and Relation Extraction. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 437–443, Online only. Association for Computational Linguistics.
Cite (Informal):
An Empirical Study of Pipeline vs. Joint approaches to Entity and Relation Extraction (Yan et al., AACL-IJCNLP 2022)
PDF:
https://aclanthology.org/2022.aacl-short.55.pdf