I am PsyAM: Modeling Happiness with Cognitive Appraisal Dimensions

Xuan Liu, Kokil Jaidka


Abstract
This paper proposes and evaluates PsyAM (https://anonymous.4open.science/r/BERT-PsyAM-10B9), a framework that incorporates adaptor modules in a sequential multi-task learning setup to generate high-dimensional feature representations of hedonic well-being (momentary happiness) in terms of its psychological underpinnings. PsyAM models emotion in text via its cognitive antecedents, using auxiliary models that achieve multi-task learning through novel feature fusion methods. We show that BERT-PsyAM has cross-task validity and cross-domain generalizability through experiments on new emotion tasks and new datasets, as well as comparisons against traditional methods and BERT baselines. We further probe the robustness of BERT-PsyAM through feature ablation studies, and discuss the qualitative inferences we can draw regarding the effectiveness of the framework for representing emotional states. We close with a discussion of a future agenda of psychology-inspired neural network architectures.
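To make the architectural idea concrete, the following is a minimal, hypothetical sketch of an adapter-style module whose auxiliary (appraisal) features are fused with the main representation before classification. It illustrates the general pattern of "adaptor modules plus feature fusion" described in the abstract; all names, dimensions, and the concatenation-based fusion are assumptions for illustration, not the authors' released BERT-PsyAM implementation.

# Hypothetical sketch: bottleneck adapters over a pooled BERT representation,
# with simple concatenation fusion of auxiliary appraisal features.
# Not the authors' implementation; dimensions and fusion choice are assumptions.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden_size=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, x):
        return x + self.up(self.act(self.down(x)))

class FusionHead(nn.Module):
    """Fuse the pooled representation with features from task-specific adapters
    (e.g., cognitive appraisal dimensions), then classify happiness."""
    def __init__(self, hidden_size=768, n_aux=2, n_labels=2):
        super().__init__()
        self.adapters = nn.ModuleList(Adapter(hidden_size) for _ in range(n_aux))
        self.classifier = nn.Linear(hidden_size * (n_aux + 1), n_labels)

    def forward(self, pooled):                     # pooled: [batch, hidden]
        aux = [a(pooled) for a in self.adapters]   # auxiliary feature views
        fused = torch.cat([pooled, *aux], dim=-1)  # concatenation fusion
        return self.classifier(fused)

# Usage (illustrative): logits = FusionHead()(torch.randn(8, 768))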
Anthology ID:
2023.findings-acl.77
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1192–1210
URL:
https://aclanthology.org/2023.findings-acl.77
DOI:
10.18653/v1/2023.findings-acl.77
Cite (ACL):
Xuan Liu and Kokil Jaidka. 2023. I am PsyAM: Modeling Happiness with Cognitive Appraisal Dimensions. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1192–1210, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
I am PsyAM: Modeling Happiness with Cognitive Appraisal Dimensions (Liu & Jaidka, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.77.pdf