Learn with Noisy Data via Unsupervised Loss Correction for Weakly Supervised Reading Comprehension

Xuemiao Zhang, Kun Zhou, Sirui Wang, Fuzheng Zhang, Zhongyuan Wang, Junfei Liu


Abstract
The weakly supervised machine reading comprehension (MRC) task is practical and promising because its training data is massive and easily available, but such data inevitably introduces noise. Existing methods usually incorporate extra submodels to filter noise before the noisy data is fed to the main models. However, these multi-stage methods often complicate training, and the quality of the submodels is hard to control. In this paper, we first explore and analyze the essential characteristics of noise from the perspective of loss distribution, and find that in the early stage of training, noisy samples usually lead to significantly larger loss values than clean ones. Based on this observation, we propose a hierarchical loss correction strategy to avoid fitting noise and to enhance clean supervision signals: an unsupervisedly fitted Gaussian mixture model computes weight factors for all losses to correct the loss distribution, and a hard bootstrapping loss modifies the loss function. Experimental results on different weakly supervised MRC datasets show that the proposed methods significantly improve models.
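The following is a minimal, hypothetical sketch (not the authors' released code) of the two correction steps the abstract describes: a two-component Gaussian mixture model is fitted, unsupervisedly, to per-sample losses, and its posterior for the low-loss component serves as a per-sample weight; a hard bootstrapping loss (in the style of Reed et al., 2015) mixes the given labels with the model's own predictions. All function names, the exact weighting scheme, and the hyperparameter beta are illustrative assumptions.

    # Illustrative sketch only; details are assumptions, not the paper's code.
    import torch
    import torch.nn.functional as F
    from sklearn.mixture import GaussianMixture

    def gmm_clean_probabilities(losses):
        """Fit a 2-component GMM to per-sample losses and return the
        posterior of the low-mean component (assumed to be 'clean')."""
        x = losses.detach().cpu().numpy().reshape(-1, 1)
        gmm = GaussianMixture(n_components=2, random_state=0).fit(x)
        clean_comp = gmm.means_.argmin()  # smaller-loss component taken as clean
        probs = gmm.predict_proba(x)[:, clean_comp]
        return torch.tensor(probs, dtype=losses.dtype, device=losses.device)

    def hard_bootstrap_loss(logits, targets, beta=0.8):
        """Hard bootstrapping: mix the (possibly noisy) given labels with
        the model's own argmax predictions."""
        pred = logits.argmax(dim=-1)
        return (beta * F.cross_entropy(logits, targets, reduction="none")
                + (1 - beta) * F.cross_entropy(logits, pred, reduction="none"))

    def corrected_loss(logits, targets, beta=0.8):
        """Hierarchical correction: bootstrapped per-sample losses,
        down-weighted by the GMM's clean-component posterior."""
        per_sample = hard_bootstrap_loss(logits, targets, beta)
        weights = gmm_clean_probabilities(per_sample)
        return (weights * per_sample).mean()

Per the paper's observation, the GMM weighting is most meaningful once early-stage training has separated the loss distribution, since noisy samples then exhibit markedly larger losses than clean ones.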
Anthology ID:
2020.coling-main.236
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2624–2634
URL:
https://aclanthology.org/2020.coling-main.236
DOI:
10.18653/v1/2020.coling-main.236
Cite (ACL):
Xuemiao Zhang, Kun Zhou, Sirui Wang, Fuzheng Zhang, Zhongyuan Wang, and Junfei Liu. 2020. Learn with Noisy Data via Unsupervised Loss Correction for Weakly Supervised Reading Comprehension. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2624–2634, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Learn with Noisy Data via Unsupervised Loss Correction for Weakly Supervised Reading Comprehension (Zhang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.236.pdf
Data
SQuAD
TriviaQA