Domain Adaptation via Prompt Learning for Alzheimer’s Detection

Shahla Farzana, Natalie Parde


Abstract
Spoken language presents a compelling medium for non-invasive Alzheimer’s disease (AD) screening, and prior work has examined the use of fine-tuned pretrained language models (PLMs) for this purpose. However, PLMs are often optimized on tasks that are inconsistent with AD classification. Spoken language corpora for AD detection are also small and disparate, making generalizability difficult. This paper investigates the use of domain-adaptive prompt fine-tuning for AD detection, using AD classification loss as the training objective and leveraging spoken language corpora from a variety of language tasks. Extensive experiments using voting-based combinations of different prompting paradigms show an impressive mean detection F1=0.8952 (with std=0.01 and best F1=0.9130) for the highest-performing approach when using BERT as the base PLM.
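The abstract's "voting-based combinations of different prompting paradigms" can be illustrated with a small sketch: each prompting paradigm produces a binary AD/control prediction per transcript, and the final label is taken by majority vote. All names and data below are hypothetical, for illustration only, and do not reflect the authors' actual implementation.

```python
# Illustrative sketch: combine per-paradigm AD predictions by majority vote.
# 1 = AD, 0 = control; paradigm names below are hypothetical examples.
from collections import Counter
from typing import Dict, List

def majority_vote(predictions: Dict[str, List[int]]) -> List[int]:
    """Return the majority-vote label for each sample across paradigms."""
    paradigms = list(predictions.values())
    n_samples = len(paradigms[0])
    voted = []
    for i in range(n_samples):
        votes = Counter(p[i] for p in paradigms)
        voted.append(votes.most_common(1)[0][0])
    return voted

# Example: three hypothetical prompting paradigms on four transcripts.
preds = {
    "manual_template": [1, 0, 1, 1],
    "soft_prompt":     [1, 0, 0, 1],
    "mixed_template":  [0, 0, 1, 1],
}
print(majority_vote(preds))  # -> [1, 0, 1, 1]
```

In practice each prediction list would come from a separately prompt-tuned PLM (e.g. BERT with a task-specific template and verbalizer); the voting step itself is model-agnostic.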
Anthology ID:
2024.findings-emnlp.937
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15963–15976
URL:
https://aclanthology.org/2024.findings-emnlp.937/
DOI:
10.18653/v1/2024.findings-emnlp.937
Cite (ACL):
Shahla Farzana and Natalie Parde. 2024. Domain Adaptation via Prompt Learning for Alzheimer’s Detection. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 15963–15976, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Domain Adaptation via Prompt Learning for Alzheimer’s Detection (Farzana & Parde, Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.937.pdf