Uncovering Main Causalities for Long-tailed Information Extraction

Guoshun Nan, Jiaqi Zeng, Rui Qiao, Zhijiang Guo, Wei Lu


Abstract
Information Extraction (IE) aims to extract structural information from unstructured texts. In practice, long-tailed distributions caused by the selection bias of a dataset may lead to incorrect correlations, also known as spurious correlations, between entities and labels in the conventional likelihood models. This motivates us to propose counterfactual IE (CFIE), a novel framework that aims to uncover the main causalities behind data in the view of causal inference. Specifically, 1) we first introduce a unified structural causal model (SCM) for various IE tasks, describing the relationships among variables; 2) with our SCM, we then generate counterfactuals based on an explicit language structure to better calculate the direct causal effect during the inference stage; 3) we further propose a novel debiasing approach to yield more robust predictions. Experiments on three IE tasks across five public datasets show the effectiveness of our CFIE model in mitigating the spurious correlation issues.
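The abstract's third step, computing the direct causal effect at inference time, can be illustrated with a minimal sketch. The toy scorer, variable names, and the plain subtraction below are illustrative assumptions for exposition, not the authors' actual CFIE implementation: the idea is to compare the factual prediction with a counterfactual one in which the biased evidence is masked out, keeping only the difference.

```python
# Hypothetical sketch of counterfactual debiasing at inference time.
# The toy model and feature layout are assumptions, not the paper's code.

def toy_model(features):
    """Toy scorer: one logit per label as a weighted sum of features."""
    weights = [[0.9, 0.1], [0.2, 0.8]]  # label x feature weights
    return [sum(w * f for w, f in zip(row, features)) for row in weights]

def debiased_scores(features, counterfactual_features, alpha=1.0):
    """Estimate the direct causal effect by subtracting the score of a
    counterfactual input (with the spurious evidence masked) from the
    score of the factual input."""
    factual = toy_model(features)
    counterfactual = toy_model(counterfactual_features)
    return [f - alpha * c for f, c in zip(factual, counterfactual)]

# Factual input vs. a counterfactual where the second (context) feature
# is masked to zero; what remains is the effect attributable to it.
scores = debiased_scores([1.0, 1.0], [1.0, 0.0])
```

Labels whose scores relied mostly on the masked evidence are suppressed after the subtraction, which is the intuition behind mitigating spurious correlations from long-tailed data.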
Anthology ID:
2021.emnlp-main.763
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9683–9695
URL:
https://aclanthology.org/2021.emnlp-main.763
DOI:
10.18653/v1/2021.emnlp-main.763
Cite (ACL):
Guoshun Nan, Jiaqi Zeng, Rui Qiao, Zhijiang Guo, and Wei Lu. 2021. Uncovering Main Causalities for Long-tailed Information Extraction. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9683–9695, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Uncovering Main Causalities for Long-tailed Information Extraction (Nan et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.763.pdf
Video:
https://aclanthology.org/2021.emnlp-main.763.mp4
Code:
heyyyyyyg/cfie
Data:
MAVEN
OntoNotes 5.0