Harnessing Dataset Cartography for Improved Compositional Generalization in Transformers

Osman İnce, Tanin Zeraati, Semih Yagcioglu, Yadollah Yaghoobzadeh, Erkut Erdem, Aykut Erdem


Abstract
Neural networks have revolutionized language modeling and excel at various downstream tasks. However, the extent to which these models achieve compositional generalization comparable to human cognitive abilities remains a topic of debate. While existing approaches in the field have mainly focused on novel architectures and alternative learning paradigms, we introduce a method harnessing the power of dataset cartography (Swayamdipta et al., 2020). By strategically identifying a subset of compositional generalization data using this approach, we achieve accuracy improvements of up to 10% on the CFQ and COGS datasets. Notably, our technique also incorporates dataset cartography as a curriculum learning criterion, eliminating the need for hyperparameter tuning while consistently achieving superior performance. Our findings highlight the untapped potential of dataset cartography for unlocking compositional generalization in Transformer models.
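The core idea behind dataset cartography, as defined in the cited Swayamdipta et al. (2020) paper, is to characterize each training example by its training dynamics: *confidence* (the mean probability assigned to the gold label across epochs) and *variability* (the standard deviation of that probability). The sketch below illustrates these two statistics and one plausible way to select a low-confidence ("hard-to-learn") subset; the function names, the selection fraction, and the use of lowest confidence as the subset criterion are illustrative assumptions, not the paper's exact recipe.

```python
# Sketch of dataset cartography statistics (per Swayamdipta et al., 2020).
# Function names and the subset-selection criterion are illustrative assumptions.
from statistics import mean, pstdev


def cartography_stats(gold_probs_per_epoch):
    """Map each example id to its confidence and variability.

    gold_probs_per_epoch: {example_id: [p_epoch1, p_epoch2, ...]},
    where each p is the model's probability on the gold label at that epoch.
    """
    return {
        ex_id: {"confidence": mean(probs), "variability": pstdev(probs)}
        for ex_id, probs in gold_probs_per_epoch.items()
    }


def hard_to_learn_subset(stats, frac=0.33):
    """Return the lowest-confidence examples (one possible subset criterion)."""
    ranked = sorted(stats, key=lambda ex: stats[ex]["confidence"])
    k = max(1, int(len(ranked) * frac))
    return ranked[:k]


# Toy usage: three examples with gold-label probabilities over three epochs.
probs = {
    "a": [0.90, 0.95, 0.92],  # consistently easy
    "b": [0.20, 0.50, 0.80],  # ambiguous: rising confidence, high variability
    "c": [0.10, 0.15, 0.12],  # hard-to-learn: consistently low confidence
}
stats = cartography_stats(probs)
hard = hard_to_learn_subset(stats, frac=0.34)
```

Sorting by confidence in descending order instead would give an easy-to-hard ordering, which is one natural way such statistics could serve as a curriculum learning criterion.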
Anthology ID:
2023.findings-emnlp.867
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13023–13041
URL:
https://aclanthology.org/2023.findings-emnlp.867
DOI:
10.18653/v1/2023.findings-emnlp.867
Cite (ACL):
Osman İnce, Tanin Zeraati, Semih Yagcioglu, Yadollah Yaghoobzadeh, Erkut Erdem, and Aykut Erdem. 2023. Harnessing Dataset Cartography for Improved Compositional Generalization in Transformers. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13023–13041, Singapore. Association for Computational Linguistics.
Cite (Informal):
Harnessing Dataset Cartography for Improved Compositional Generalization in Transformers (İnce et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.867.pdf