CTYUN-AI@SMM4H-2024: Knowledge Extension Makes Expert Models

Yuming Fan, Dongming Yang, Lina Cao


Abstract
This paper explores the potential of social media as a rich source of data for understanding public health trends and behaviors, particularly focusing on emotional well-being and the impact of environmental factors. We employed large language models (LLMs) and developed a suite of knowledge extension techniques to analyze social media content related to mental health issues, specifically examining 1) the effects of outdoor spaces on social anxiety symptoms on Reddit, 2) tweets reporting children’s medical disorders, and 3) self-reported ages in posts on Twitter and Reddit. Our knowledge extension approach encompasses both supervised data (i.e., sample augmentation and cross-task fine-tuning) and unsupervised data (i.e., knowledge distillation and cross-task pre-training), tackling the inherent challenges of sample imbalance and the informality of social media language. The effectiveness of our approach is demonstrated by superior performance across multiple tasks (Tasks 3, 5, and 6) at SMM4H-2024. Notably, we achieved the best performance in all three tasks, underscoring the utility of our models in real-world applications.
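The abstract names knowledge distillation as one of its unsupervised knowledge extension techniques but does not publish code. As a hedged illustration only, the sketch below shows a standard soft-label distillation objective for a classification setup (Hinton-style, not necessarily the authors' formulation); the temperature `T` and mixing weight `alpha` are hypothetical hyperparameters.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Illustrative distillation objective (assumed, not from the paper).

    Mixes a soft-label term (match the teacher's temperature-scaled
    distribution) with a hard-label term (ordinary cross-entropy).
    """
    # KL divergence between temperature-softened student and teacher
    # distributions; the T^2 factor keeps gradient scale comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Supervised cross-entropy on gold labels, where available.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```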
Anthology ID:
2024.smm4h-1.2
Volume:
Proceedings of The 9th Social Media Mining for Health Research and Applications (SMM4H 2024) Workshop and Shared Tasks
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Dongfang Xu, Graciela Gonzalez-Hernandez
Venues:
SMM4H | WS
Publisher:
Association for Computational Linguistics
Pages:
5–9
URL:
https://aclanthology.org/2024.smm4h-1.2
Cite (ACL):
Yuming Fan, Dongming Yang, and Lina Cao. 2024. CTYUN-AI@SMM4H-2024: Knowledge Extension Makes Expert Models. In Proceedings of The 9th Social Media Mining for Health Research and Applications (SMM4H 2024) Workshop and Shared Tasks, pages 5–9, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
CTYUN-AI@SMM4H-2024: Knowledge Extension Makes Expert Models (Fan et al., SMM4H-WS 2024)
PDF:
https://aclanthology.org/2024.smm4h-1.2.pdf