QAVSA: Question Answering using Vector Symbolic Algebras

Ryan Laube, Chris Eliasmith


Abstract
With the advancement of large pretrained language models (PLMs), many question answering (QA) benchmarks have been developed to evaluate the reasoning capabilities of these models. Augmenting PLMs with external knowledge in the form of Knowledge Graphs (KGs) is a popular way to improve their reasoning capabilities, and Graph Neural Networks (GNNs) are a common method for reasoning over KGs. As an alternative to GNNs for augmenting PLMs, we propose a novel graph reasoning module that uses Vector Symbolic Algebra (VSA) graph representations and a k-layer MLP. We demonstrate that our VSA-based model performs as well as QA-GNN, a model combining a PLM and a GNN module, on 3 multiple-choice question answering (MCQA) datasets. Our model has a simpler architecture than QA-GNN and also converges 39% faster during training.
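To give a sense of what a VSA graph representation is, here is a minimal sketch. It does not reproduce the paper's encoding; it assumes Holographic Reduced Representations (HRR), a common VSA in which binding is circular convolution, and shows how KG triples can be superposed into one fixed-width vector and then queried by approximate unbinding. The vocabulary, triples, and function names are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # vector dimensionality

def random_vector(d=D):
    # Unit-norm random vector, as typically used in HRR-style VSAs.
    v = rng.normal(0.0, 1.0 / np.sqrt(d), d)
    return v / np.linalg.norm(v)

def bind(a, b):
    # Circular convolution: the HRR binding operator.
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def unbind(a, b):
    # Approximate unbinding: bind with the involution of b
    # (first element kept, remaining elements reversed).
    return bind(a, np.concatenate(([b[0]], b[:0:-1])))

# Hypothetical vocabulary of concept and relation vectors.
vocab = {name: random_vector() for name in
         ["bird", "wing", "fly", "has_part", "capable_of"]}

# A toy KG as (subject, relation, object) triples.
triples = [("bird", "has_part", "wing"),
           ("bird", "capable_of", "fly")]

# Superpose bound triples into a single fixed-width graph vector.
graph = sum(bind(bind(vocab[s], vocab[r]), vocab[o]) for s, r, o in triples)

# Query: what is a bird capable of? Unbind subject and relation,
# then clean up by cosine similarity against the vocabulary.
probe = unbind(unbind(graph, vocab["bird"]), vocab["capable_of"])
sims = {name: float(probe @ v) / np.linalg.norm(probe)
        for name, v in vocab.items()}
best = max(sims, key=sims.get)
```

Because the graph vector has a fixed width regardless of how many triples it encodes, it can be fed to a standard k-layer MLP, which is what makes this a drop-in alternative to message-passing GNN modules.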
Anthology ID:
2024.repl4nlp-1.14
Volume:
Proceedings of the 9th Workshop on Representation Learning for NLP (RepL4NLP-2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Chen Zhao, Marius Mosbach, Pepa Atanasova, Seraphina Goldfarb-Tarrent, Peter Hase, Arian Hosseini, Maha Elbayad, Sandro Pezzelle, Maximilian Mozes
Venues:
RepL4NLP | WS
Publisher:
Association for Computational Linguistics
Pages:
191–202
URL:
https://aclanthology.org/2024.repl4nlp-1.14
Cite (ACL):
Ryan Laube and Chris Eliasmith. 2024. QAVSA: Question Answering using Vector Symbolic Algebras. In Proceedings of the 9th Workshop on Representation Learning for NLP (RepL4NLP-2024), pages 191–202, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
QAVSA: Question Answering using Vector Symbolic Algebras (Laube & Eliasmith, RepL4NLP-WS 2024)
PDF:
https://aclanthology.org/2024.repl4nlp-1.14.pdf