Using counterfactual contrast to improve compositional generalization for multi-step quantitative reasoning

Armineh Nourbakhsh, Sameena Shah, Carolyn Rosé


Abstract
In quantitative question answering, compositional generalization is one of the main challenges for state-of-the-art models, especially when longer sequences of reasoning steps are required. In this paper we propose CounterComp, a method that uses counterfactual scenarios to generate samples with compositional contrast. Rather than relying on data augmentation, CounterComp is based on metric learning, which allows samples to be drawn directly from the training set and circumvents the need for additional human labels. Our proposed auxiliary metric-learning loss improves the performance of three state-of-the-art models on four recently released datasets. We also show that the approach improves out-of-distribution (OOD) performance on unseen domains as well as unseen compositions. Lastly, we demonstrate how the method leads to better compositional attention patterns during training.
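The abstract does not spell out the loss formulation, so the following is only a rough PyTorch sketch of what an auxiliary metric-learning objective over counterfactual-contrast samples might look like. The function name auxiliary_triplet_loss, the margin, and the weighting factor lam are illustrative assumptions, not the paper's actual method.

import torch
import torch.nn.functional as F

def auxiliary_triplet_loss(anchor, positive, negative, margin=1.0):
    # Pull the anchor question embedding toward a compositionally similar
    # (positive) sample and push it away from a counterfactual (negative)
    # sample drawn from the same training set; inputs are (batch, dim).
    pos_dist = F.pairwise_distance(anchor, positive)
    neg_dist = F.pairwise_distance(anchor, negative)
    return F.relu(pos_dist - neg_dist + margin).mean()

# Combined objective: the main task loss plus the auxiliary term,
# weighted by a hypothetical hyperparameter lam:
#   total_loss = task_loss + lam * auxiliary_triplet_loss(a, p, n)

Because the auxiliary term operates directly on training-set embeddings, no extra labeled data is needed, which matches the abstract's contrast with data-augmentation approaches.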
Anthology ID:
2023.acl-long.834
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14930–14943
URL:
https://aclanthology.org/2023.acl-long.834
DOI:
10.18653/v1/2023.acl-long.834
Cite (ACL):
Armineh Nourbakhsh, Sameena Shah, and Carolyn Rosé. 2023. Using counterfactual contrast to improve compositional generalization for multi-step quantitative reasoning. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14930–14943, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Using counterfactual contrast to improve compositional generalization for multi-step quantitative reasoning (Nourbakhsh et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.834.pdf
Video:
https://aclanthology.org/2023.acl-long.834.mp4