Detecting Incongruent News Articles Using Multi-head Attention Dual Summarization

Sujit Kumar, Gaurav Kumar, Sanasam Ranbir Singh


Abstract
With the increasing use of incongruent news headlines crafted to influence readers and spread fake news, detecting incongruent news articles has become an important research challenge. Most earlier studies on incongruity detection estimate the similarity between the headline and an encoding of the body or of its summary. However, most of these methods fail to handle incongruent news articles created with embedded noise. Motivated by this issue, this paper proposes a Multi-head Attention Dual Summarization (MADS) based method that generates two types of summaries, capturing the congruent and incongruent parts of the body separately. Across various experimental setups on three publicly available datasets, the proposed model outperforms state-of-the-art baselines.
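The page gives only the abstract, so as a rough, hypothetical illustration of the dual-summarization idea it describes (headline-conditioned multi-head attention producing one "congruent" and one "incongruent" summary of the body, which are then scored against the headline), here is a minimal PyTorch sketch. All module names, dimensions, and the scoring head are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal sketch of a headline-conditioned dual-summarization scorer.
# NOTE: dimensions, module names, and the classifier head are assumptions;
# the paper's real MADS architecture, losses, and encoders are not shown here.
import torch
import torch.nn as nn


class DualSummaryScorer(nn.Module):
    """Uses the headline encoding as a query over body-sentence encodings to
    build two attention-pooled summaries (one intended to gather congruent
    evidence, one incongruent evidence) and scores the triple."""

    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        # Two independent multi-head attention blocks, one per summary type.
        self.congruent_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.incongruent_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Binary incongruity score over [headline; congruent; incongruent].
        self.classifier = nn.Linear(3 * dim, 1)

    def forward(self, headline: torch.Tensor, body: torch.Tensor) -> torch.Tensor:
        # headline: (batch, 1, dim) pooled headline encoding, used as the query.
        # body:     (batch, n_sentences, dim) body-sentence encodings.
        congruent, _ = self.congruent_attn(headline, body, body)
        incongruent, _ = self.incongruent_attn(headline, body, body)
        features = torch.cat([headline, congruent, incongruent], dim=-1)
        return torch.sigmoid(self.classifier(features)).squeeze(-1)


# Toy usage with random tensors standing in for a real sentence encoder.
scorer = DualSummaryScorer()
headline = torch.randn(2, 1, 256)   # batch of 2 headlines
body = torch.randn(2, 12, 256)      # 12 body sentences each
print(scorer(headline, body).shape)  # torch.Size([2, 1])
```

In practice the two attention blocks would need training signals (e.g., separate objectives) that push them toward congruent and incongruent evidence respectively; the sketch above only shows the data flow.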
Anthology ID: 2022.aacl-main.70
Volume: Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month: November
Year: 2022
Address: Online only
Editors: Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
Venues: AACL | IJCNLP
Publisher: Association for Computational Linguistics
Pages: 967–977
URL: https://aclanthology.org/2022.aacl-main.70
Cite (ACL): Sujit Kumar, Gaurav Kumar, and Sanasam Ranbir Singh. 2022. Detecting Incongruent News Articles Using Multi-head Attention Dual Summarization. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 967–977, Online only. Association for Computational Linguistics.
Cite (Informal): Detecting Incongruent News Articles Using Multi-head Attention Dual Summarization (Kumar et al., AACL-IJCNLP 2022)
PDF: https://aclanthology.org/2022.aacl-main.70.pdf