Comparing DAE-based and MASS-based UNMT: Robustness to Word-Order Divergence in English→Indic Language Pairs

Tamali Banerjee, Rudra Murthy, Pushpak Bhattacharyya


Abstract
The proliferation of fake news poses a significant challenge in the digital era. Detecting false information, especially in non-English languages, is crucial to combating misinformation effectively. In this research, we introduce a novel approach to Dravidian fake news detection by harnessing the capabilities of the MuRIL transformer model, further enhanced with gradient accumulation. Our study focuses on the Dravidian languages, a diverse group of languages spoken in South India that are often underserved in natural language processing research. By accumulating gradients over multiple batches, we optimize memory usage, stabilize training, and improve the model's overall performance. The proposed model achieves promising results in both accuracy and efficiency. Our findings underline the significance of adapting state-of-the-art techniques, such as MuRIL-based models and gradient accumulation, to non-English languages.
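The abstract's core training trick, accumulating gradients over several mini-batches before each optimizer step, can be illustrated with a minimal sketch. This is not the authors' code; it assumes a standard HuggingFace fine-tuning setup with the public "google/muril-base-cased" checkpoint, a binary classification head, and illustrative hyperparameters (ACCUM_STEPS, learning rate) not taken from the paper.

```python
# Minimal sketch of gradient accumulation for MuRIL fine-tuning (illustrative only).
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "google/muril-base-cased"   # public MuRIL checkpoint (assumption)
ACCUM_STEPS = 4                          # hypothetical accumulation window

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def train_epoch(loader: DataLoader) -> None:
    model.train()
    optimizer.zero_grad()
    for step, batch in enumerate(loader):
        outputs = model(
            input_ids=batch["input_ids"],
            attention_mask=batch["attention_mask"],
            labels=batch["labels"],
        )
        # Scale the loss so the summed gradients match one large-batch update.
        loss = outputs.loss / ACCUM_STEPS
        loss.backward()
        # Step the optimizer only once every ACCUM_STEPS mini-batches,
        # trading compute time for a larger effective batch size in memory.
        if (step + 1) % ACCUM_STEPS == 0:
            optimizer.step()
            optimizer.zero_grad()
```

The effective batch size here is the DataLoader batch size times ACCUM_STEPS, which is how a small-memory GPU can emulate large-batch training.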
Anthology ID:
2023.icon-1.44
Volume:
Proceedings of the 20th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2023
Address:
Goa University, Goa, India
Editors:
Jyoti D. Pawar, Sobha Lalitha Devi
Venue:
ICON
SIG:
SIGLEX
Publisher:
NLP Association of India (NLPAI)
Note:
Pages:
491–496
Language:
URL:
https://aclanthology.org/2023.icon-1.44
DOI:
Bibkey:
Cite (ACL):
Tamali Banerjee, Rudra Murthy, and Pushpak Bhattacharyya. 2023. Comparing DAE-based and MASS-based UNMT: Robustness to Word-Order Divergence in English→Indic Language Pairs. In Proceedings of the 20th International Conference on Natural Language Processing (ICON), pages 491–496, Goa University, Goa, India. NLP Association of India (NLPAI).
Cite (Informal):
Comparing DAE-based and MASS-based UNMT: Robustness to Word-Order Divergence in English→Indic Language Pairs (Banerjee et al., ICON 2023)
PDF:
https://aclanthology.org/2023.icon-1.44.pdf