Multi-task and Multi-corpora Training Strategies to Enhance Argumentative Sentence Linking Performance

Jan Wira Gotama Putra, Simone Teufel, Takenobu Tokunaga


Abstract
Argumentative structure prediction aims to establish links between textual units and label the relationship between them, forming a structured representation for a given input text. The former task, linking, has been identified by earlier works as particularly challenging, as it requires finding the most appropriate structure out of a very large search space of possible link combinations. In this paper, we improve a state-of-the-art linking model by using multi-task and multi-corpora training strategies. Our auxiliary tasks help the model to learn the role of each sentence in the argumentative structure. Combining multi-corpora training with a selective sampling strategy increases the training data size while ensuring that the model still learns the desired target distribution well. Experiments on essays written by English-as-a-foreign-language learners show that both strategies significantly improve the model’s performance; for instance, we observe a 15.8% increase in the F1-macro for individual link predictions.
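
To make the multi-task idea in the abstract concrete, here is a minimal, hypothetical PyTorch sketch: a shared sentence representation feeds a primary linking head and an auxiliary sentence-role head, and the two cross-entropy losses are summed with a weight. The class name, head shapes, role inventory, and the 0.5 weight are illustrative assumptions, not the authors' actual architecture; see the linked code repository for the real implementation.

```python
import torch
import torch.nn as nn

class LinkingWithAuxiliaryTasks(nn.Module):
    """Illustrative multi-task setup: one shared representation, a primary
    head for linking and an auxiliary head for each sentence's argumentative
    role. All names and sizes are assumptions for this sketch."""

    def __init__(self, hidden_dim: int, max_sentences: int, num_roles: int):
        super().__init__()
        # Primary task: for each sentence, score every candidate target
        # sentence it could link to (a simplification of pointer-style linking).
        self.link_head = nn.Linear(hidden_dim, max_sentences)
        # Auxiliary task: classify each sentence's argumentative role.
        self.role_head = nn.Linear(hidden_dim, num_roles)

    def forward(self, sentence_reprs: torch.Tensor):
        # sentence_reprs: (num_sentences, hidden_dim) from a shared encoder.
        return self.link_head(sentence_reprs), self.role_head(sentence_reprs)

def multitask_loss(link_logits, role_logits, link_gold, role_gold, aux_weight=0.5):
    """Weighted sum of the primary (linking) and auxiliary (role) losses;
    the 0.5 weight is an arbitrary placeholder."""
    ce = nn.functional.cross_entropy
    return ce(link_logits, link_gold) + aux_weight * ce(role_logits, role_gold)

# Toy usage: 5 sentences with 64-dimensional representations.
model = LinkingWithAuxiliaryTasks(hidden_dim=64, max_sentences=5, num_roles=3)
reprs = torch.randn(5, 64)
link_logits, role_logits = model(reprs)
loss = multitask_loss(link_logits, role_logits,
                      link_gold=torch.tensor([0, 0, 1, 1, 3]),
                      role_gold=torch.tensor([2, 0, 1, 1, 0]))
loss.backward()
```

The multi-corpora strategy is not shown: in the paper it additionally enlarges the training set with selectively sampled examples from other corpora so that the model still fits the target distribution; that filtering step is omitted from this sketch.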
Anthology ID: 2021.argmining-1.2
Volume: Proceedings of the 8th Workshop on Argument Mining
Month: November
Year: 2021
Address: Punta Cana, Dominican Republic
Editors: Khalid Al-Khatib, Yufang Hou, Manfred Stede
Venue: ArgMining
Publisher: Association for Computational Linguistics
Pages: 12–23
URL: https://aclanthology.org/2021.argmining-1.2
DOI: 10.18653/v1/2021.argmining-1.2
Cite (ACL): Jan Wira Gotama Putra, Simone Teufel, and Takenobu Tokunaga. 2021. Multi-task and Multi-corpora Training Strategies to Enhance Argumentative Sentence Linking Performance. In Proceedings of the 8th Workshop on Argument Mining, pages 12–23, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Multi-task and Multi-corpora Training Strategies to Enhance Argumentative Sentence Linking Performance (Putra et al., ArgMining 2021)
PDF: https://aclanthology.org/2021.argmining-1.2.pdf
Code: wiragotama/argmin2021