Confidence-based Ensembling of Perspective-aware Models

Silvia Casola, Soda Lo, Valerio Basile, Simona Frenda, Alessandra Cignarella, Viviana Patti, Cristina Bosco


Abstract
Research in the field of NLP has recently focused on the variability that people show in selecting labels when performing an annotation task. Exploiting disagreements in annotations has been shown to offer advantages for accurate modelling and fair evaluation. In this paper, we propose a strongly perspectivist model for the supervised classification of natural language utterances. Our approach combines the predictions of several perspective-aware models, using key information about their individual confidence, to capture the subjectivity encoded in the annotation of linguistic phenomena. We validate our method through experiments on two case studies, irony and hate speech detection, in in-domain and cross-domain settings. The results suggest that confidence-based ensembling of perspective-aware models benefits classification performance in all scenarios. In addition, we demonstrate the effectiveness of our method with perspectives automatically extracted from annotations when annotators’ metadata are not available.
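
To illustrate the general idea described in the abstract, the sketch below shows one possible form of confidence-based ensembling for a binary task such as irony or hate speech detection. This is not the authors' implementation: the function names, the use of per-model positive-class probabilities, and the choice of distance from the 0.5 decision boundary as the confidence measure are assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's code): combine the predictions of several
# perspective-aware binary classifiers, weighting each model's vote by its
# own confidence. Here, confidence is assumed to be the distance of the
# positive-class probability from the 0.5 decision boundary.

from typing import List


def ensemble_predict(perspective_probs: List[float]) -> int:
    """Return an ensemble label (0 or 1) from per-perspective probabilities.

    Each perspective model casts a vote (+1 for positive, -1 for negative)
    weighted by how far its probability lies from 0.5.
    """
    score = 0.0
    for p in perspective_probs:
        vote = 1.0 if p >= 0.5 else -1.0
        confidence = abs(p - 0.5)  # hypothetical confidence measure
        score += vote * confidence
    return 1 if score >= 0.0 else 0


if __name__ == "__main__":
    # Three hypothetical perspective-aware models scoring the same utterance:
    # two lean weakly towards "not ironic", one is strongly confident it is.
    probs = [0.45, 0.40, 0.95]
    print(ensemble_predict(probs))  # -> 1 (the confident model dominates)
```

The design choice illustrated here is that a highly confident minority perspective can outweigh several uncertain ones, which is one way an ensemble can exploit annotator disagreement rather than averaging it away.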
Anthology ID:
2023.emnlp-main.212
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3496–3507
URL:
https://aclanthology.org/2023.emnlp-main.212
DOI:
10.18653/v1/2023.emnlp-main.212
Cite (ACL):
Silvia Casola, Soda Lo, Valerio Basile, Simona Frenda, Alessandra Cignarella, Viviana Patti, and Cristina Bosco. 2023. Confidence-based Ensembling of Perspective-aware Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3496–3507, Singapore. Association for Computational Linguistics.
Cite (Informal):
Confidence-based Ensembling of Perspective-aware Models (Casola et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.212.pdf
Video:
https://aclanthology.org/2023.emnlp-main.212.mp4