Dravidian Fake News Detection with Gradient Accumulation based Transformer Model

Raja Eduri, Soni Badal, Borgohain Samir Kumar, Lalrempuii Candy


Abstract
The proliferation of fake news poses a significant challenge in the digital era. Detecting false information, especially in non-English languages, is crucial to combating misinformation effectively. In this research, we introduce a novel approach for Dravidian fake news detection by harnessing the capabilities of the MuRIL transformer model, further enhanced by gradient accumulation techniques. Our study focuses on the Dravidian languages, a diverse group of languages spoken in South India, which are often underserved in natural language processing research. By accumulating gradients over multiple batches, we optimize memory usage, stabilize training, and improve the model's overall performance. The proposed model exhibits promising results in terms of both accuracy and efficiency. Our findings underline the significance of adapting state-of-the-art techniques, such as MuRIL-based models and gradient accumulation, to non-English languages to address the pressing issue of fake news.
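The abstract's core trick, accumulating gradients over several micro-batches before taking a single optimizer step, can be illustrated with a minimal framework-free sketch. This is not the authors' implementation; it is a toy 1-D linear model with hand-computed gradients, chosen so the equivalence to a larger effective batch is easy to verify:

```python
# Minimal sketch of gradient accumulation (toy example, not the paper's code).
# Model: y = w * x, loss: 0.5 * (w*x - y)^2, gradients computed by hand.

def grad(w, batch):
    # Mean gradient of the loss over the batch: d/dw 0.5*(w*x - y)^2 = (w*x - y) * x
    return sum((w * x - y) * x for x, y in batch) / len(batch)

def accumulate_step(w, micro_batches, lr):
    # Accumulate gradients over several micro-batches, then take ONE
    # optimizer step -- mimicking a larger effective batch size while
    # only ever holding one micro-batch in memory.
    acc = 0.0
    for mb in micro_batches:
        acc += grad(w, mb)          # gradients add up; no weight update yet
    acc /= len(micro_batches)       # average so it matches the full-batch gradient
    return w - lr * acc             # single update with the accumulated gradient

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
micro = [data[:2], data[2:]]        # two equal-size micro-batches

w_accum = accumulate_step(0.0, micro, lr=0.1)
w_full = 0.0 - 0.1 * grad(0.0, data)   # one step on the full batch
assert w_accum == w_full               # same update, smaller memory footprint
```

With equal-size micro-batches the averaged accumulated gradient is identical to the full-batch gradient, which is why the technique stabilizes training at a larger effective batch size without exceeding GPU memory.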
Anthology ID:
2023.icon-1.40
Volume:
Proceedings of the 20th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2023
Address:
Goa University, Goa, India
Editors:
Jyoti D. Pawar, Sobha Lalitha Devi
Venue:
ICON
SIG:
SIGLEX
Publisher:
NLP Association of India (NLPAI)
Pages:
466–471
URL:
https://aclanthology.org/2023.icon-1.40
Cite (ACL):
Raja Eduri, Soni Badal, Borgohain Samir Kumar, and Lalrempuii Candy. 2023. Dravidian Fake News Detection with Gradient Accumulation based Transformer Model. In Proceedings of the 20th International Conference on Natural Language Processing (ICON), pages 466–471, Goa University, Goa, India. NLP Association of India (NLPAI).
Cite (Informal):
Dravidian Fake News Detection with Gradient Accumulation based Transformer Model (Eduri et al., ICON 2023)
PDF:
https://aclanthology.org/2023.icon-1.40.pdf