Hi-ArG: Exploring the Integration of Hierarchical Argumentation Graphs in Language Pretraining

Jingcong Liang, Rong Ye, Meng Han, Qi Zhang, Ruofei Lai, Xinyu Zhang, Zhao Cao, Xuanjing Huang, Zhongyu Wei


Abstract
A knowledge graph is a structure for storing and representing knowledge, and recent studies have discussed its ability to assist language models in various applications. Some variations of knowledge graphs aim to record arguments and their relations for computational argumentation tasks. However, many must simplify semantic types to fit specific schemas, thus losing flexibility and expressive power. In this paper, we propose the **Hi**erarchical **Ar**gumentation **G**raph (Hi-ArG), a new structure to organize arguments. We also introduce two approaches to exploit Hi-ArG, including a text-graph multi-modal model, GreaseArG, and a new pre-training framework augmented with graph information. Experiments on two argumentation tasks show that, after further pre-training and fine-tuning, GreaseArG surpasses language models of the same scale on these tasks, while incorporating graph information during further pre-training also improves the performance of vanilla language models. Code for this paper is available at <https://github.com/ljcleo/Hi-ArG>.
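The abstract does not spell out the graph schema itself; as a rough illustration only, the sketch below shows how a two-level argumentation graph (proposition nodes plus support/attack relations between them) might be represented in Python. All class and method names here are hypothetical and are not taken from the Hi-ArG codebase.

```python
# Illustrative sketch only: Hi-ArG defines its own schema, which is not
# reproduced here. This toy structure shows the general idea of a two-level
# argumentation graph -- a lower level of proposition nodes and an upper
# level of support/attack edges between them. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Literal

Relation = Literal["support", "attack"]


@dataclass
class Proposition:
    """Lower level: a single proposition extracted from argument text."""
    pid: str
    text: str


@dataclass
class ArgGraph:
    """Upper level: propositions linked by argumentative relations."""
    nodes: dict[str, Proposition] = field(default_factory=dict)
    edges: list[tuple[str, str, Relation]] = field(default_factory=list)

    def add_proposition(self, pid: str, text: str) -> None:
        self.nodes[pid] = Proposition(pid, text)

    def relate(self, src: str, dst: str, rel: Relation) -> None:
        # The proposition `src` argues for (or against) `dst`.
        self.edges.append((src, dst, rel))


if __name__ == "__main__":
    g = ArgGraph()
    g.add_proposition("p1", "Nuclear energy is low-carbon.")
    g.add_proposition("p2", "We should expand nuclear energy.")
    g.add_proposition("p3", "Nuclear waste is hard to store safely.")
    g.relate("p1", "p2", "support")
    g.relate("p3", "p2", "attack")
    print(g.edges)
```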
Anthology ID:
2023.emnlp-main.902
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14606–14620
URL:
https://aclanthology.org/2023.emnlp-main.902
DOI:
10.18653/v1/2023.emnlp-main.902
Cite (ACL):
Jingcong Liang, Rong Ye, Meng Han, Qi Zhang, Ruofei Lai, Xinyu Zhang, Zhao Cao, Xuanjing Huang, and Zhongyu Wei. 2023. Hi-ArG: Exploring the Integration of Hierarchical Argumentation Graphs in Language Pretraining. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14606–14620, Singapore. Association for Computational Linguistics.
Cite (Informal):
Hi-ArG: Exploring the Integration of Hierarchical Argumentation Graphs in Language Pretraining (Liang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.902.pdf
Video:
https://aclanthology.org/2023.emnlp-main.902.mp4