Hierarchical Attention Network for Explainable Depression Detection on Twitter Aided by Metaphor Concept Mappings

Sooji Han, Rui Mao, Erik Cambria


Abstract
Automatic depression detection on Twitter can help individuals privately and conveniently assess their mental health status at an early stage, before seeing mental health professionals. Most existing black-box deep learning methods for depression detection have focused largely on improving classification performance. However, explaining model decisions is imperative in health research, where decision-making is often high-stakes, even a matter of life and death. Reliable automatic diagnosis of mental health problems, including depression, should be supported by credible explanations that justify a model's predictions. In this work, we propose a novel explainable model for depression detection on Twitter. It comprises a novel encoder that combines hierarchical attention mechanisms and feed-forward neural networks. To support psycholinguistic studies, our model leverages metaphorical concept mappings as input. It thus not only detects depressed individuals, but also identifies features of such users' tweets and the associated metaphor concept mappings.
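The abstract describes an encoder built from hierarchical attention mechanisms, whose attention weights make the model's decisions explainable at both the word and tweet level. A minimal NumPy sketch of such two-level attention pooling is shown below; this is an illustrative assumption, not the paper's implementation, and the context vectors `w_word` and `w_tweet` and the dot-product scoring are simplified stand-ins for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Score each row of H against context vector w, then return the
    attention-weighted sum of rows plus the weights themselves."""
    scores = H @ w                       # (n,)
    alpha = softmax(scores)              # attention weights, sum to 1
    return alpha @ H, alpha              # pooled vector (d,), weights (n,)

def hierarchical_encode(tweets, w_word, w_tweet):
    """Two-level attention: pool word embeddings into tweet vectors,
    then pool tweet vectors into one user representation. The returned
    weights indicate which words and tweets drove the representation,
    supporting explanation of downstream predictions."""
    tweet_vecs, word_attns = [], []
    for T in tweets:                     # T: (n_words, d) word embeddings
        v, a = attention_pool(T, w_word)
        tweet_vecs.append(v)
        word_attns.append(a)
    V = np.stack(tweet_vecs)             # (n_tweets, d)
    user_vec, tweet_attn = attention_pool(V, w_tweet)
    return user_vec, word_attns, tweet_attn

# Toy usage: a user with 2 tweets, embedding dimension 4.
rng = np.random.default_rng(0)
tweets = [rng.normal(size=(3, 4)), rng.normal(size=(5, 4))]
user_vec, word_attns, tweet_attn = hierarchical_encode(
    tweets, rng.normal(size=4), rng.normal(size=4)
)
print(user_vec.shape)   # (4,)
```

In a full model, the user-level vector would feed a feed-forward classifier, and the per-level attention weights would be inspected to identify salient tweets and words (or metaphor concept mappings) behind a "depressed" prediction.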
Anthology ID:
2022.coling-1.9
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
94–104
URL:
https://aclanthology.org/2022.coling-1.9
Cite (ACL):
Sooji Han, Rui Mao, and Erik Cambria. 2022. Hierarchical Attention Network for Explainable Depression Detection on Twitter Aided by Metaphor Concept Mappings. In Proceedings of the 29th International Conference on Computational Linguistics, pages 94–104, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Hierarchical Attention Network for Explainable Depression Detection on Twitter Aided by Metaphor Concept Mappings (Han et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.9.pdf