Generating Commonsense Counterfactuals for Stable Relation Extraction

Xin Miao, Yongqi Li, Tieyun Qian


Abstract
Recent studies on counterfactually augmented data have achieved great success in coarse-grained natural language processing tasks. However, existing methods encounter two major problems when applied to fine-grained relation extraction. First, they struggle to accurately identify causal terms under the invariant-entity constraint. Second, they ignore the commonsense constraint. To solve these problems, we propose a novel framework that generates commonsense counterfactuals for stable relation extraction. Specifically, to identify causal terms accurately, we introduce an intervention-based strategy and leverage a constituency parser for correction. To satisfy the commonsense constraint, we introduce the concept knowledge base WordNet and design a bottom-up relation expansion algorithm on it to uncover commonsense relations between entities. We conduct a series of comprehensive evaluations under low-resource, out-of-domain, and adversarial-attack settings. The results demonstrate that our framework significantly enhances the stability of base relation extraction models.
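The bottom-up relation expansion idea in the abstract can be sketched in a few lines. The code below is an illustrative approximation, not the authors' implementation: it uses a hand-coded toy hypernym table and a hypothetical seed set of concept-pair relations in place of WordNet, and simply climbs hypernym links from both entities, collecting every relation attested at any ancestor level, so relations that hold for broader concepts also become commonsense candidates for the specific entity pair.

```python
# Illustrative sketch of bottom-up relation expansion over a concept
# hierarchy. The hypernym links and seed relations are toy stand-ins
# for WordNet, chosen purely for demonstration.

# Toy hypernym links (child -> parent); WordNet would supply these.
HYPERNYMS = {
    "novelist": "writer",
    "writer": "person",
    "city": "location",
}

# Relations attested between concept pairs (hypothetical seed set).
CONCEPT_RELATIONS = {
    ("person", "location"): {"born_in", "lives_in"},
    ("writer", "city"): {"works_in"},
}

def hypernym_chain(concept):
    """Return the concept followed by all of its ancestors, bottom-up."""
    chain = [concept]
    while chain[-1] in HYPERNYMS:
        chain.append(HYPERNYMS[chain[-1]])
    return chain

def expand_relations(head, tail):
    """Collect every relation attested at any pair of hypernym levels:
    the bottom-up expansion of commonsense candidate relations."""
    relations = set()
    for h in hypernym_chain(head):
        for t in hypernym_chain(tail):
            relations |= CONCEPT_RELATIONS.get((h, t), set())
    return relations
```

For example, `expand_relations("novelist", "city")` walks novelist → writer → person and city → location, returning `{"works_in", "born_in", "lives_in"}`: relations attested at broader concept levels are inherited as commonsense candidates for the specific entity pair.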
Anthology ID:
2023.emnlp-main.344
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5654–5668
URL:
https://aclanthology.org/2023.emnlp-main.344
DOI:
10.18653/v1/2023.emnlp-main.344
Cite (ACL):
Xin Miao, Yongqi Li, and Tieyun Qian. 2023. Generating Commonsense Counterfactuals for Stable Relation Extraction. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5654–5668, Singapore. Association for Computational Linguistics.
Cite (Informal):
Generating Commonsense Counterfactuals for Stable Relation Extraction (Miao et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.344.pdf
Video:
https://aclanthology.org/2023.emnlp-main.344.mp4