Can Data Diversity Enhance Learning Generalization?

Yu Yu, Shahram Khadivi, Jia Xu


Abstract
This paper introduces our Diversity Advanced Actor-Critic reinforcement learning (A2C) framework (DAAC) to improve the generalization and accuracy of Natural Language Processing (NLP) models. We show that diversifying training samples alleviates overfitting and improves model generalization and accuracy. We quantify the diversity of a set of samples using max dispersion, convex hull volume, and graph entropy, computed over sentence embeddings in a high-dimensional metric space. We also introduce A2C to efficiently select such a diversified training subset. Our experiments achieve up to a +23.8 accuracy increase (38.0% relative) in sentiment analysis, a -44.7 perplexity decrease (37.9% relative) in language modeling, and consistent improvements in named entity recognition across various domains. In particular, our method outperforms both domain adaptation and generalization baselines without using any target domain knowledge.
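Of the three diversity measures the abstract names, max dispersion is the simplest: the largest pairwise distance (diameter) of the embedding set. The sketch below is illustrative only; the function name, distance choice (Euclidean), and toy data are assumptions, not taken from the paper, and the paper's actual metrics operate on learned sentence embeddings.

```python
import numpy as np

def max_dispersion(embeddings: np.ndarray) -> float:
    """Largest pairwise Euclidean distance among sentence embeddings.

    embeddings: (n, d) array, one row per sentence embedding.
    """
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(embeddings ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * embeddings @ embeddings.T
    # Clip tiny negatives from floating-point error before the square root
    return float(np.sqrt(np.clip(d2, 0.0, None).max()))

# Toy example with four 2-D "embeddings": the diameter is the
# distance between (0, 0) and (3, 4), i.e. 5.0.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [3.0, 4.0]])
print(max_dispersion(pts))  # 5.0
```

Convex hull volume and graph entropy reward spread in complementary ways (volume penalizes collinear points, entropy captures cluster structure), which is presumably why the paper combines all three rather than relying on dispersion alone.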
Anthology ID:
2022.coling-1.437
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4933–4945
URL:
https://aclanthology.org/2022.coling-1.437
Cite (ACL):
Yu Yu, Shahram Khadivi, and Jia Xu. 2022. Can Data Diversity Enhance Learning Generalization?. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4933–4945, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Can Data Diversity Enhance Learning Generalization? (Yu et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.437.pdf
Data
WikiText-2