Inferring Social Media Users’ Mental Health Status from Multimodal Information

Zhentao Xu, Verónica Pérez-Rosas, Rada Mihalcea


Abstract
Worldwide, an increasing number of people suffer from mental health disorders such as depression and anxiety. In the United States alone, one in every four adults suffers from a mental health condition, which makes mental health a pressing concern. In this paper, we explore the use of multimodal cues present in social media posts to predict users’ mental health status. Specifically, we focus on identifying social media activity that indicates either a mental health condition or its onset. We collect posts from Flickr and apply a multimodal approach that jointly analyzes language, visual, and metadata cues and their relation to mental health. We conduct several classification experiments aiming to discriminate between (1) healthy users and users affected by a mental illness; and (2) healthy users and users prone to mental illness. Our experimental results indicate that using multiple modalities improves classification performance compared to using one modality at a time, and can provide important cues into a user’s mental status.
Anthology ID:
2020.lrec-1.772
Volume:
Proceedings of the 12th Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
6292–6299
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.772
Cite (ACL):
Zhentao Xu, Verónica Pérez-Rosas, and Rada Mihalcea. 2020. Inferring Social Media Users’ Mental Health Status from Multimodal Information. In Proceedings of the 12th Language Resources and Evaluation Conference, pages 6292–6299, Marseille, France. European Language Resources Association.
Cite (Informal):
Inferring Social Media Users’ Mental Health Status from Multimodal Information (Xu et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.772.pdf