Probing Structural Knowledge from Pre-trained Language Model for Argumentation Relation Classification

Yang Sun, Bin Liang, Jianzhu Bao, Min Yang, Ruifeng Xu


Abstract
Extracting fine-grained structural information between argumentation component (AC) pairs is essential for argumentation relation classification (ARC). However, most previous studies model the relationship between AC pairs using only AC-level similarity or semantically relevant features; they ignore the complex interaction between AC pairs and therefore cannot reason deeply about the argumentation relation. In this paper, we propose a novel dual prior graph neural network (DPGNN) that jointly exploits probing knowledge derived from pre-trained language models (PLMs) and syntactic information to comprehensively model the relationship between AC pairs. Specifically, we construct a probing graph from PLM-derived probing knowledge to recognize and align the relational information within and across the argumentation components. In addition, we propose a mutual dependency graph for the AC pair to capture fine-grained syntactic structure, in which the syntactic correlation between words is determined by dependency information within each AC and a mutual attention mechanism across ACs. The knowledge learned from the probing graph and the dependency graph is combined to comprehensively capture the aligned relationships of AC pairs and improve ARC. Experimental results on three public datasets show that DPGNN outperforms state-of-the-art baselines by a noticeable margin.
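The dual-graph idea in the abstract — encode the AC pair over two separate graphs (a PLM-probing graph and a syntactic dependency graph) and then combine the two views into one pair representation — can be sketched minimally as follows. This is an illustrative sketch only, not the authors' implementation: the adjacency matrices `A_probe` and `A_dep` stand in for the probing graph and the mutual dependency graph (here filled with random hypothetical edges), and a single symmetrically normalized GCN layer is used for each view.

```python
import numpy as np

def gcn_layer(A, X, W):
    # One graph-convolution layer: add self-loops, symmetrically
    # normalize the adjacency, propagate features, apply ReLU.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

# Toy setting: 6 word nodes spanning an AC pair, 4-dim features.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))

# Hypothetical edge sets standing in for the two priors.
A_probe = (rng.random((6, 6)) > 0.5).astype(float)  # probing-graph edges
A_dep = (rng.random((6, 6)) > 0.5).astype(float)    # dependency-graph edges
A_probe = np.maximum(A_probe, A_probe.T)            # make undirected
A_dep = np.maximum(A_dep, A_dep.T)

# Separate parameters per view, then encode each graph independently.
W_probe = rng.standard_normal((4, 4))
W_dep = rng.standard_normal((4, 4))
H_probe = gcn_layer(A_probe, X, W_probe)
H_dep = gcn_layer(A_dep, X, W_dep)

# Combine the two views and mean-pool to a pair-level representation,
# which a classifier head would map to a relation label.
H = np.concatenate([H_probe, H_dep], axis=1)  # shape (6, 8)
pair_repr = H.mean(axis=0)                    # shape (8,)
print(pair_repr.shape)
```

The combination step here is plain concatenation followed by mean pooling; the paper's actual fusion and the construction of the probing and mutual-attention edges are more involved and are described in the full text.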
Anthology ID:
2022.findings-emnlp.264
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3605–3615
URL:
https://aclanthology.org/2022.findings-emnlp.264
DOI:
10.18653/v1/2022.findings-emnlp.264
Cite (ACL):
Yang Sun, Bin Liang, Jianzhu Bao, Min Yang, and Ruifeng Xu. 2022. Probing Structural Knowledge from Pre-trained Language Model for Argumentation Relation Classification. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 3605–3615, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Probing Structural Knowledge from Pre-trained Language Model for Argumentation Relation Classification (Sun et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.264.pdf