Sharpness-Aware Minimization Improves Language Model Generalization

Dara Bahri, Hossein Mobahi, Yi Tay


Abstract
The allure of superhuman-level capabilities has led to considerable interest in language models like GPT-3 and T5, wherein the research has, by and large, revolved around new model architectures, training tasks, and loss objectives, along with substantial engineering efforts to scale up model capacity and dataset size. Comparatively little work has been done to improve the generalization of these models through better optimization. In this work, we show that Sharpness-Aware Minimization (SAM), a recently proposed optimization procedure that encourages convergence to flatter minima, can substantially improve the generalization of language models without much computational overhead. We show that SAM is able to boost performance on SuperGLUE, GLUE, Web Questions, Natural Questions, Trivia QA, and TyDiQA, with particularly large gains when training data for these tasks is limited.
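SAM augments each optimizer step with an inner ascent: it first perturbs the weights toward the point of (approximately) highest loss within a small L2 ball, then applies the gradient computed at that perturbed point to the original weights, which encourages convergence to flatter minima. The following Python sketch illustrates this two-step update on a toy quadratic objective; the loss function, hyperparameter values (rho, lr), and helper names are illustrative assumptions, not the authors' implementation.

# A minimal sketch of one SAM update (toy setting, not the paper's code).
import numpy as np

def loss(w):
    # Toy objective: a quadratic bowl with its minimum at the origin.
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # Analytic gradient of the toy objective.
    return w

def sam_step(w, lr=0.1, rho=0.05):
    # Step 1: ascend to the (approximate) worst-case point within
    # an L2 ball of radius rho around the current weights.
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Step 2: compute the gradient at the perturbed weights and
    # apply it to the ORIGINAL weights.
    g_sharp = grad(w + eps)
    return w - lr * g_sharp

w = np.array([3.0, -2.0])
for _ in range(50):
    w = sam_step(w)
print(w)  # approaches the flat minimum at the origin

Note that SAM needs two gradient computations per step (one at the original weights, one at the perturbed weights), so each training step costs roughly twice as much as a standard optimizer step.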
Anthology ID:
2022.acl-long.508
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7360–7371
URL:
https://aclanthology.org/2022.acl-long.508
DOI:
10.18653/v1/2022.acl-long.508
Cite (ACL):
Dara Bahri, Hossein Mobahi, and Yi Tay. 2022. Sharpness-Aware Minimization Improves Language Model Generalization. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7360–7371, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Sharpness-Aware Minimization Improves Language Model Generalization (Bahri et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.508.pdf
Video:
https://aclanthology.org/2022.acl-long.508.mp4
Data
GLUE, Natural Questions, SuperGLUE, TriviaQA, TyDiQA, TyDiQA-GoldP, WebQuestions