HiddenCut: Simple Data Augmentation for Natural Language Understanding with Better Generalizability

Jiaao Chen, Dinghan Shen, Weizhu Chen, Diyi Yang


Abstract
Fine-tuning large pre-trained models with task-specific data has achieved great success in NLP. However, it has been demonstrated that the majority of information within the self-attention networks is redundant and not utilized effectively during the fine-tuning stage. This leads to inferior results when generalizing the obtained models to out-of-domain distributions. To this end, we propose a simple yet effective data augmentation technique, HiddenCut, to better regularize the model and encourage it to learn more generalizable features. Specifically, contiguous spans within the hidden space are dynamically and strategically dropped during training. Experiments show that our HiddenCut method outperforms the state-of-the-art augmentation methods on the GLUE benchmark, and consistently exhibits superior generalization performance on out-of-distribution and challenging counterexamples. We have publicly released our code at https://github.com/GT-SALT/HiddenCut.
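The core idea described in the abstract, dropping contiguous spans of hidden representations during training, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name, the uniform span placement, and the `cut_ratio` hyperparameter are assumptions, and the paper's actual method selects spans strategically (e.g., informed by attention) inside a transformer rather than uniformly at random.

```python
import numpy as np

def hidden_cut(hidden, cut_ratio=0.1, rng=None):
    """Zero out one contiguous span of hidden vectors per example.

    hidden: array of shape (batch, seq_len, dim), e.g., a layer's
        hidden states during training.
    cut_ratio: assumed hyperparameter, the fraction of the sequence
        to drop as a single contiguous span.
    Span positions are sampled uniformly here for illustration; the
    paper drops spans dynamically and strategically.
    """
    if rng is None:
        rng = np.random.default_rng()
    batch, seq_len, _ = hidden.shape
    span = max(1, int(seq_len * cut_ratio))
    out = hidden.copy()
    for i in range(batch):
        # Pick a random start so the span fits inside the sequence.
        start = rng.integers(0, seq_len - span + 1)
        out[i, start:start + span, :] = 0.0
    return out

# Example: drop a 2-token span from each of 2 sequences of length 10.
h = np.ones((2, 10, 4))
h_cut = hidden_cut(h, cut_ratio=0.2, rng=np.random.default_rng(0))
```

At inference time no spans are dropped, analogous to standard dropout being disabled at test time.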
Anthology ID:
2021.acl-long.338
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
4380–4390
URL:
https://aclanthology.org/2021.acl-long.338
DOI:
10.18653/v1/2021.acl-long.338
Cite (ACL):
Jiaao Chen, Dinghan Shen, Weizhu Chen, and Diyi Yang. 2021. HiddenCut: Simple Data Augmentation for Natural Language Understanding with Better Generalizability. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4380–4390, Online. Association for Computational Linguistics.
Cite (Informal):
HiddenCut: Simple Data Augmentation for Natural Language Understanding with Better Generalizability (Chen et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.338.pdf
Video:
https://aclanthology.org/2021.acl-long.338.mp4
Code:
GT-SALT/HiddenCut
Data:
GLUE | IMDb Movie Reviews | QNLI