Using Convolution Neural Network with BERT for Stance Detection in Vietnamese

Oanh Tran, Anh Cong Phung, Bach Xuan Ngo


Abstract
Stance detection is the task of automatically eliciting stance information towards a specific claim made by a primary author. While most studies have been done for high-resource languages, this work is dedicated to a low-resource language, namely Vietnamese. In this paper, we propose an architecture using transformers to detect stances in Vietnamese claims. This architecture exploits BERT to extract contextual word embeddings instead of using traditional word2vec models. These embeddings are then fed into CNN networks to extract local features for training the stance detection model. We performed extensive comparison experiments on a public dataset to show the effectiveness of the proposed method. Experimental results show that the proposed model outperforms previous methods by a large margin, yielding an accuracy of 75.57% averaged over four labels. This sets a new SOTA result for future research on this interesting problem in Vietnamese.
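The pipeline the abstract describes (contextual embeddings from BERT fed into a CNN with max-over-time pooling, then a 4-way stance classifier) can be sketched minimally in NumPy. This is an illustrative sketch only: random vectors stand in for BERT's contextual embeddings, and the shapes, filter widths, and filter counts are assumptions, not the paper's actual hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_features(X, W, b):
    """Slide one filter W of width k over the token axis of X and apply ReLU.

    X: (seq_len, d) contextual embeddings; W: (k, d) filter; b: scalar bias.
    Returns a (seq_len - k + 1,) feature map.
    """
    k = W.shape[0]
    out = np.array([np.sum(X[i:i + k] * W) + b
                    for i in range(X.shape[0] - k + 1)])
    return np.maximum(out, 0.0)  # ReLU

def cnn_sentence_vector(X, filters):
    # Max-over-time pooling per filter, concatenated into one sentence vector.
    return np.array([conv1d_features(X, W, b).max() for W, b in filters])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Stand-in for BERT output: 12 tokens, 768-dim (hypothetical input, not real BERT).
X = rng.standard_normal((12, 768))

# Kim-style text-CNN filters of widths 2, 3, 4 (two random filters per width).
filters = [(rng.standard_normal((k, 768)) * 0.01, 0.0)
           for k in (2, 3, 4) for _ in range(2)]

v = cnn_sentence_vector(X, filters)        # pooled local-feature vector, shape (6,)
W_out = rng.standard_normal((4, v.size)) * 0.1
probs = softmax(W_out @ v)                 # distribution over the 4 stance labels
print(probs.shape)
```

In the paper's setting, `X` would come from a pretrained BERT encoder for Vietnamese, and the filter weights and output layer would be trained jointly on the labeled stance dataset.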
Anthology ID:
2022.lrec-1.783
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
7220–7225
URL:
https://aclanthology.org/2022.lrec-1.783
Cite (ACL):
Oanh Tran, Anh Cong Phung, and Bach Xuan Ngo. 2022. Using Convolution Neural Network with BERT for Stance Detection in Vietnamese. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 7220–7225, Marseille, France. European Language Resources Association.
Cite (Informal):
Using Convolution Neural Network with BERT for Stance Detection in Vietnamese (Tran et al., LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.783.pdf