Enhancing Social Media Health Prediction Certainty by Integrating Large Language Models with Transformer Classifiers

Sedigh Khademi, Christopher Palmer, Muhammad Javed, Jim Buttery, Gerardo Dimaguila


Abstract
This paper presents our approach for SMM4H 2024 Task 5, which focuses on identifying tweets in which users report that their child has ADHD, ASD, delayed speech, or asthma. Our pipeline combines transformer-based classifiers with the GPT-4 large language model (LLM). We first address data imbalance in the training set using topic modelling and under-sampling. Next, we train RoBERTa-based classifiers on the adjusted data. Finally, GPT-4 refines the classifiers' predictions for uncertain cases (confidence below 0.9). This strategy achieved a significant improvement over the baseline RoBERTa models. Our work demonstrates the effectiveness of combining transformer classifiers and LLMs for extracting health insights from social media conversations.
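The abstract describes a two-stage prediction step: a fine-tuned RoBERTa classifier handles confident predictions, and GPT-4 is consulted only when the classifier's confidence falls below 0.9. The sketch below illustrates that hand-off; the 0.9 threshold comes from the abstract, but the model checkpoint, prompt wording, and label handling are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the classifier + LLM refinement step, assuming a
# Hugging Face text-classification model and the OpenAI chat API.
from transformers import pipeline
from openai import OpenAI

CONFIDENCE_THRESHOLD = 0.9  # threshold reported in the abstract

# Placeholder checkpoint: the paper uses fine-tuned RoBERTa classifiers,
# whose weights are not specified here.
classifier = pipeline("text-classification", model="roberta-base")
llm = OpenAI()  # reads OPENAI_API_KEY from the environment

def classify_tweet(text: str) -> str:
    """Return a label, deferring low-confidence cases to GPT-4."""
    pred = classifier(text)[0]  # e.g. {"label": "POSITIVE", "score": 0.87}
    if pred["score"] >= CONFIDENCE_THRESHOLD:
        return pred["label"]
    # Uncertain case: ask GPT-4 to decide (prompt is hypothetical).
    response = llm.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                "Does the author of this tweet report that their own child has "
                "ADHD, ASD, delayed speech, or asthma? "
                "Answer 'positive' or 'negative'.\n\nTweet: " + text
            ),
        }],
    )
    return response.choices[0].message.content.strip()
```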
Anthology ID: 2024.smm4h-1.16
Volume: Proceedings of The 9th Social Media Mining for Health Research and Applications (SMM4H 2024) Workshop and Shared Tasks
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Dongfang Xu, Graciela Gonzalez-Hernandez
Venues: SMM4H | WS
Publisher: Association for Computational Linguistics
Pages: 71–73
URL: https://aclanthology.org/2024.smm4h-1.16
Cite (ACL): Sedigh Khademi, Christopher Palmer, Muhammad Javed, Jim Buttery, and Gerardo Dimaguila. 2024. Enhancing Social Media Health Prediction Certainty by Integrating Large Language Models with Transformer Classifiers. In Proceedings of The 9th Social Media Mining for Health Research and Applications (SMM4H 2024) Workshop and Shared Tasks, pages 71–73, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Enhancing Social Media Health Prediction Certainty by Integrating Large Language Models with Transformer Classifiers (Khademi et al., SMM4H-WS 2024)
PDF: https://aclanthology.org/2024.smm4h-1.16.pdf