Semantic Similarity Models for Depression Severity Estimation

Anxo Pérez, Neha Warikoo, Kexin Wang, Javier Parapar, Iryna Gurevych


Abstract
Depressive disorders constitute a severe public health issue worldwide. However, public health systems have limited capacity for case detection and diagnosis. In this regard, the widespread use of social media has opened up a way to access public information on a large scale. Computational methods can serve as support tools for rapid screening by exploiting this user-generated social media content. This paper presents an efficient semantic pipeline to study depression severity in individuals based on their social media writings. We use test users' sentences to produce semantic rankings over an index of representative training sentences corresponding to depressive symptoms and severity levels. Then, we use the sentences from those results as evidence for predicting symptom severity. For that, we explore different aggregation methods to answer one of the four Beck Depression Inventory (BDI-II) options per symptom. We evaluate our methods on two Reddit-based benchmarks, achieving improvements over the state of the art in measuring depression severity.
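The abstract's retrieval-then-aggregate pipeline can be illustrated with a minimal sketch (not the authors' code): user sentences query an index of symptom-annotated training sentences via semantic similarity, and the retrieved BDI-II options are aggregated per symptom. The encoder name, the example index entries, and the majority-vote aggregation below are illustrative assumptions, not details from the paper.

from collections import Counter

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed encoder, for illustration

# Hypothetical index: sentences annotated with (symptom, BDI-II option 0-3).
indexed_sentences = [
    ("I can't enjoy anything anymore", "loss_of_pleasure", 3),
    ("I still enjoy my hobbies as before", "loss_of_pleasure", 0),
    # ... one entry per representative training sentence
]

def predict_option(user_sentences, symptom, k=10):
    """Return a BDI-II option (0-3) for one symptom by majority vote
    over the k indexed sentences most similar to the user's sentences."""
    pool = [(text, opt) for text, sym, opt in indexed_sentences if sym == symptom]
    corpus_emb = model.encode([t for t, _ in pool], convert_to_tensor=True)
    query_emb = model.encode(user_sentences, convert_to_tensor=True)
    votes = Counter()
    # semantic_search ranks the symptom's indexed sentences per user sentence.
    for hits in util.semantic_search(query_emb, corpus_emb, top_k=k):
        for hit in hits:
            votes[pool[hit["corpus_id"]][1]] += 1  # each retrieved sentence votes
    return votes.most_common(1)[0][0] if votes else 0

The paper explores several aggregation strategies; majority voting here simply stands in for that family of methods.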
Anthology ID:
2023.emnlp-main.1000
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16104–16118
URL:
https://aclanthology.org/2023.emnlp-main.1000
DOI:
10.18653/v1/2023.emnlp-main.1000
Cite (ACL):
Anxo Pérez, Neha Warikoo, Kexin Wang, Javier Parapar, and Iryna Gurevych. 2023. Semantic Similarity Models for Depression Severity Estimation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 16104–16118, Singapore. Association for Computational Linguistics.
Cite (Informal):
Semantic Similarity Models for Depression Severity Estimation (Pérez et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.1000.pdf
Video:
https://aclanthology.org/2023.emnlp-main.1000.mp4