Rethinking Label Smoothing on Multi-hop Question Answering

Yin Zhangyue, Wang Yuxin, Hu Xiannian, Wu Yiguang, Yan Hang, Zhang Xinyu, Cao Zhao, Huang Xuanjing, Qiu Xipeng


Abstract
Multi-Hop Question Answering (MHQA) is a significant area in question answering, requiring multiple reasoning components, including document retrieval, supporting sentence prediction, and answer span extraction. In this work, we present the first application of label smoothing to the MHQA task, aiming to enhance generalization capabilities in MHQA systems while mitigating overfitting of answer spans and reasoning paths in the training set. We introduce a novel label smoothing technique, F1 Smoothing, which incorporates uncertainty into the learning process and is specifically tailored for Machine Reading Comprehension (MRC) tasks. Moreover, we employ a Linear Decay Label Smoothing Algorithm (LDLA) in conjunction with curriculum learning to progressively reduce uncertainty throughout the training process. Experiments on the HotpotQA dataset confirm the effectiveness of our approach in improving generalization and achieving significant improvements, leading to new state-of-the-art performance on the HotpotQA leaderboard.
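To make the abstract's two ingredients concrete, below is a minimal PyTorch sketch of conventional uniform label smoothing combined with a linearly decaying smoothing factor, in the spirit of LDLA's idea of progressively reducing label uncertainty over training. This is an illustrative assumption, not the paper's implementation: the function names (`smoothed_cross_entropy`, `linear_decay_eps`) and the starting value `eps0 = 0.1` are invented here, and the F1 Smoothing formulation itself, which is specific to span extraction, is defined only in the paper.

```python
import torch
import torch.nn.functional as F

def smoothed_cross_entropy(logits: torch.Tensor,
                           targets: torch.Tensor,
                           eps: float) -> torch.Tensor:
    """Cross-entropy with uniform label smoothing: the gold class keeps
    probability mass (1 - eps); eps is spread uniformly over all classes."""
    log_probs = F.log_softmax(logits, dim=-1)
    nll = -log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)
    uniform = -log_probs.mean(dim=-1)  # expected NLL under a uniform label
    return ((1.0 - eps) * nll + eps * uniform).mean()

def linear_decay_eps(epoch: int, total_epochs: int, eps0: float = 0.1) -> float:
    """Linearly decay the smoothing factor from eps0 toward 0, so early
    epochs train under high label uncertainty and later epochs converge
    to the hard one-hot targets (a curriculum over uncertainty)."""
    return eps0 * max(0.0, 1.0 - epoch / total_epochs)

# Illustrative use for answer-span start logits in an MRC head:
# eps = linear_decay_eps(epoch, total_epochs)
# loss = smoothed_cross_entropy(start_logits, start_positions, eps)
```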
Anthology ID:
2023.ccl-1.53
Volume:
Proceedings of the 22nd Chinese National Conference on Computational Linguistics
Month:
August
Year:
2023
Address:
Harbin, China
Editors:
Maosong Sun, Bing Qin, Xipeng Qiu, Jing Jiang, Xianpei Han
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
611–623
Language:
English
URL:
https://aclanthology.org/2023.ccl-1.53
Cite (ACL):
Yin Zhangyue, Wang Yuxin, Hu Xiannian, Wu Yiguang, Yan Hang, Zhang Xinyu, Cao Zhao, Huang Xuanjing, and Qiu Xipeng. 2023. Rethinking Label Smoothing on Multi-hop Question Answering. In Proceedings of the 22nd Chinese National Conference on Computational Linguistics, pages 611–623, Harbin, China. Chinese Information Processing Society of China.
Cite (Informal):
Rethinking Label Smoothing on Multi-hop Question Answering (Zhangyue et al., CCL 2023)
PDF:
https://aclanthology.org/2023.ccl-1.53.pdf