Multiple Evidence Combination for Fact-Checking of Health-Related Information

Pritam Deka, Anna Jurek-Loughrey, Deepak P


Abstract
Fact-checking of health-related claims has become necessary in this digital age, where any information posted online is easily available to everyone. The most effective way to verify such claims is by using evidence obtained from reliable sources of medical knowledge, such as PubMed. Recent advances in the field of NLP have helped automate such fact-checking tasks. In this work, we propose a domain-specific BERT-based model using a transfer learning approach for the task of predicting the veracity of claim-evidence pairs for the verification of health-related facts. We also propose a method to combine multiple pieces of evidence retrieved for a single claim, taking conflicting evidence into account. We further show how our model can be exploited when labelled data is available and how back-translation can be used to augment data when there is data scarcity.
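The paper's exact combination scheme is given in the full text; as a minimal illustrative sketch (not the authors' method), the idea of aggregating per-evidence veracity predictions for one claim while accounting for conflicts can look like this, assuming each retrieved evidence has already been labelled SUPPORTS, REFUTES, or NEUTRAL:

```python
from collections import Counter

def aggregate_evidence(labels):
    """Combine per-evidence veracity labels for a single claim.

    Illustrative rule only (not the paper's scheme): discard
    neutral evidence; if the remaining labels conflict, flag the
    claim as MIXED; otherwise return the unanimous label.
    """
    informative = [label for label in labels if label != "NEUTRAL"]
    if not informative:
        return "NOT ENOUGH INFO"   # every evidence was neutral
    counts = Counter(informative)
    if len(counts) > 1:
        return "MIXED"             # conflicting evidence detected
    return informative[0]          # unanimous SUPPORTS or REFUTES
```

A weighted variant could use the model's per-pair confidence scores instead of discrete labels, which is closer in spirit to combining classifier outputs for a claim.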
Anthology ID:
2023.bionlp-1.20
Volume:
The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Kevin Cohen
Venue:
BioNLP
Publisher:
Association for Computational Linguistics
Pages:
237–247
URL:
https://aclanthology.org/2023.bionlp-1.20
DOI:
10.18653/v1/2023.bionlp-1.20
Cite (ACL):
Pritam Deka, Anna Jurek-Loughrey, and Deepak P. 2023. Multiple Evidence Combination for Fact-Checking of Health-Related Information. In The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks, pages 237–247, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Multiple Evidence Combination for Fact-Checking of Health-Related Information (Deka et al., BioNLP 2023)
PDF:
https://aclanthology.org/2023.bionlp-1.20.pdf