Compositional Generalization with Grounded Language Models

Sondre Wold, Étienne Simon, Lucas Charpentier, Egor Kostylev, Erik Velldal, Lilja Øvrelid


Abstract
Grounded language models use external sources of information, such as knowledge graphs, to meet some of the general challenges associated with pre-training. By extending previous work on compositional generalization in semantic parsing, we enable a controlled evaluation of the degree to which these models learn and generalize from patterns in knowledge graphs. We develop a procedure for generating natural language questions paired with knowledge graphs that targets different aspects of compositionality and avoids grounding the language models in information already implicitly encoded in their weights. We evaluate existing methods for combining language models with knowledge graphs and find that they struggle to generalize to sequences of unseen lengths and to novel combinations of seen base components. While our experimental results provide some insight into the expressive power of these models, we hope our work and released datasets motivate future research on how to better combine language models with structured knowledge representations.
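As a rough illustration of the evaluation setup the abstract describes, the sketch below shows one way a length-based compositional generalization split could be constructed: questions are composed from relation templates over synthetic knowledge-graph paths, and the split trains on short compositions while testing on longer, unseen ones. This is a hypothetical Python sketch, not the authors' released code or datasets; all names (RELATIONS, make_example, length_split) are invented for illustration.

```python
# Hypothetical sketch of a length-based compositional generalization split
# for KG-grounded question answering. Not the paper's actual procedure.
import itertools
import random

# Toy relation vocabulary with question templates (invented for illustration).
RELATIONS = {
    "director_of": "the director of {}",
    "spouse_of": "the spouse of {}",
    "birthplace_of": "the birthplace of {}",
}
ENTITIES = ["e1", "e2", "e3", "e4"]

def make_example(relation_path, entity):
    """Compose a question and its grounding KG triples from a relation path."""
    phrase = entity
    triples = []
    current = entity
    for rel in relation_path:
        target = f"{current}_{rel}"  # synthetic tail entity for the hop
        triples.append((current, rel, target))
        phrase = RELATIONS[rel].format(phrase)
        current = target
    question = f"Who is {phrase}?"
    return {"question": question, "graph": triples, "answer": current}

def length_split(max_train_len=2, test_len=3, n_per_path=2, seed=0):
    """Train on compositions up to max_train_len; test only on longer ones."""
    rng = random.Random(seed)
    train, test = [], []
    for length in range(1, test_len + 1):
        for path in itertools.product(RELATIONS, repeat=length):
            for entity in rng.sample(ENTITIES, n_per_path):
                example = make_example(path, entity)
                (train if length <= max_train_len else test).append(example)
    return train, test

train, test = length_split()
print(len(train), len(test))        # short compositions vs. unseen-length ones
print(test[0]["question"])          # e.g. a 3-hop composed question
```

An analogous split over novel combinations of seen base components (rather than lengths) could hold out specific relation pairs from training while keeping each relation individually attested.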
Anthology ID: 2024.findings-acl.205
Volume: Findings of the Association for Computational Linguistics ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand and virtual meeting
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 3447–3460
URL: https://aclanthology.org/2024.findings-acl.205
Cite (ACL): Sondre Wold, Étienne Simon, Lucas Charpentier, Egor Kostylev, Erik Velldal, and Lilja Øvrelid. 2024. Compositional Generalization with Grounded Language Models. In Findings of the Association for Computational Linguistics ACL 2024, pages 3447–3460, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal): Compositional Generalization with Grounded Language Models (Wold et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.205.pdf