Meta Self-Refinement for Robust Learning with Weak Supervision

Dawei Zhu, Xiaoyu Shen, Michael Hedderich, Dietrich Klakow


Abstract
Training deep neural networks (DNNs) under weak supervision has attracted increasing research attention as it can significantly reduce the annotation cost. However, labels from weak supervision can be noisy, and the high capacity of DNNs enables them to easily overfit the label noise, resulting in poor generalization. Recent methods leverage self-training to build noise-resistant models, in which a teacher trained under weak supervision is used to provide highly confident labels for teaching the students. Nevertheless, the teacher derived from such frameworks may have fitted a substantial amount of noise and therefore produce incorrect pseudo-labels with high confidence, leading to severe error propagation. In this work, we propose Meta Self-Refinement (MSR), a noise-resistant learning framework, to effectively combat label noise from weak supervision. Instead of relying on a fixed teacher trained with noisy labels, we encourage the teacher to refine its pseudo-labels. At each training step, MSR performs a meta gradient descent on the current mini-batch to maximize the student performance on a clean validation set. Extensive experimentation on eight NLP benchmarks demonstrates that MSR is robust against label noise in all settings and outperforms state-of-the-art methods by up to 11.4% in accuracy and 9.26% in F1 score.
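The abstract sketches the core mechanism: at each step the teacher's pseudo-labels are refined by a meta gradient step chosen to improve the student's loss on a small clean validation set. Below is a minimal, hypothetical PyTorch sketch of that idea for intuition only; it is not the authors' released implementation, and the toy linear models, learning rates, and random data are illustrative assumptions.

```python
# Sketch of meta self-refinement of a teacher via a simulated one-step student update.
# Assumptions: toy linear teacher/student, random "weakly labelled" batch, tiny clean
# validation set. PyTorch >= 2.0 for torch.func.functional_call.
import torch
import torch.nn.functional as F
from torch.func import functional_call

torch.manual_seed(0)
dim, n_classes, lr_student, lr_teacher = 16, 3, 0.1, 0.01

teacher = torch.nn.Linear(dim, n_classes)
student = torch.nn.Linear(dim, n_classes)
teacher_opt = torch.optim.SGD(teacher.parameters(), lr=lr_teacher)

# Toy data: a weakly labelled mini-batch and a small clean validation set.
x_weak = torch.randn(32, dim)
x_val = torch.randn(8, dim)
y_val = torch.randint(0, n_classes, (8,))

for step in range(5):
    # 1) Teacher produces soft pseudo-labels for the weakly labelled batch.
    pseudo = F.softmax(teacher(x_weak), dim=-1)

    # 2) Simulate one SGD step of the student on those pseudo-labels,
    #    keeping the graph so gradients can flow back to the teacher.
    s_params = dict(student.named_parameters())
    s_logits = functional_call(student, s_params, (x_weak,))
    s_loss = F.kl_div(F.log_softmax(s_logits, dim=-1), pseudo, reduction="batchmean")
    grads = torch.autograd.grad(s_loss, list(s_params.values()), create_graph=True)
    s_params_new = {name: p - lr_student * g
                    for (name, p), g in zip(s_params.items(), grads)}

    # 3) Meta objective: the updated student's loss on the clean validation set.
    val_logits = functional_call(student, s_params_new, (x_val,))
    meta_loss = F.cross_entropy(val_logits, y_val)

    # 4) Meta gradient descent on the teacher, refining its pseudo-labels.
    teacher_opt.zero_grad()
    meta_loss.backward()
    teacher_opt.step()

    # 5) Commit the student's simulated update (values only, detached from the graph).
    with torch.no_grad():
        for p, p_new in zip(student.parameters(), s_params_new.values()):
            p.copy_(p_new)
```

The key design point the sketch illustrates is that the teacher is never fixed: its parameters receive gradients through the student's one-step update, so pseudo-labels that would mislead the student on clean validation data are progressively corrected.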
Anthology ID:
2023.eacl-main.74
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1043–1058
URL:
https://aclanthology.org/2023.eacl-main.74
DOI:
10.18653/v1/2023.eacl-main.74
Cite (ACL):
Dawei Zhu, Xiaoyu Shen, Michael Hedderich, and Dietrich Klakow. 2023. Meta Self-Refinement for Robust Learning with Weak Supervision. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1043–1058, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Meta Self-Refinement for Robust Learning with Weak Supervision (Zhu et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.74.pdf
Video:
https://aclanthology.org/2023.eacl-main.74.mp4