ACTOR: Active Learning with Annotator-specific Classification Heads to Embrace Human Label Variation

Xinpeng Wang, Barbara Plank


Abstract
Label aggregation such as majority voting is commonly used to resolve annotator disagreement in dataset creation. However, this may disregard minority values and opinions. Recent studies indicate that learning from individual annotations outperforms learning from aggregated labels, though they require a considerable amount of annotation. Active learning, as an annotation cost-saving strategy, has not been fully explored in the context of learning from disagreement. We show that in the active learning setting, a multi-head model performs significantly better than a single-head model in terms of uncertainty estimation. By designing and evaluating acquisition functions with annotator-specific heads on two datasets, we show that group-level entropy works generally well on both datasets. Importantly, it achieves performance in terms of both prediction and uncertainty estimation comparable to full-scale training from disagreement, while saving 70% of the annotation budget.
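The abstract's central idea is an acquisition function that scores unlabeled examples by the entropy of the group-level prediction pooled from annotator-specific classification heads. As a rough illustration only, here is a minimal PyTorch sketch of how such a group-level entropy score might be computed and used to pick an acquisition batch; the function and variable names (`group_level_entropy`, `select_batch`, `per_head_logits`) are hypothetical and not taken from the paper's implementation.

```python
import torch
import torch.nn.functional as F

def group_level_entropy(per_head_logits: torch.Tensor) -> torch.Tensor:
    """Entropy of the averaged (group-level) prediction for one example.

    per_head_logits: (num_annotators, num_classes) logits produced by the
    annotator-specific classification heads for a single unlabeled example.
    Returns a scalar acquisition score; higher means more group-level uncertainty.
    """
    probs = F.softmax(per_head_logits, dim=-1)   # per-annotator class distributions
    group_probs = probs.mean(dim=0)              # pooled ("group") distribution
    return -(group_probs * torch.log(group_probs + 1e-12)).sum()

def select_batch(pool_logits: torch.Tensor, k: int) -> torch.Tensor:
    """Score an unlabeled pool and return indices of the k most uncertain examples.

    pool_logits: (pool_size, num_annotators, num_classes)
    """
    scores = torch.stack([group_level_entropy(x) for x in pool_logits])
    return scores.topk(k).indices
```

In this sketch, averaging the per-head distributions before taking the entropy captures disagreement across annotator heads as well as within-head uncertainty, which is one plausible reading of a "group-level" score; the paper should be consulted for the exact formulation it evaluates.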
Anthology ID:
2023.emnlp-main.126
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2046–2052
URL:
https://aclanthology.org/2023.emnlp-main.126
DOI:
10.18653/v1/2023.emnlp-main.126
Cite (ACL):
Xinpeng Wang and Barbara Plank. 2023. ACTOR: Active Learning with Annotator-specific Classification Heads to Embrace Human Label Variation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2046–2052, Singapore. Association for Computational Linguistics.
Cite (Informal):
ACTOR: Active Learning with Annotator-specific Classification Heads to Embrace Human Label Variation (Wang & Plank, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.126.pdf
Video:
https://aclanthology.org/2023.emnlp-main.126.mp4