AcKnowledge: Acquired Knowledge Representation by Small Language Model Without Pre-training

Sourav Das, Sanjay Chatterji, Imon Mukherjee


Abstract
Large language models (LLMs) are pre-trained on enormous amounts of text data and have shown widely recognized success in knowledge representation. However, this approach has two bottlenecks. (1) Once a model is deployed, its pre-training data cannot be regularly updated, and a model that cannot represent up-to-date knowledge is of limited use. (2) The ever-increasing model sizes and computational requirements make it difficult for non-commercial and individual researchers to fine-tune and scale these language models. Moreover, most major LLMs with external knowledge access are proprietary. In this paper, we propose AcKnowledge, a framework wrapped around a small, non-pre-trained language model for an open-domain question-answering (QA) experiment. AcKnowledge learns relevant knowledge from the internet via meta-learning based on user questions, and re-learns from user feedback if knowledge is misrepresented. Our efficient knowledge representation framework avoids pre-training overhead while still providing access to updated information. Benchmarking shows competitive performance against similarly sized state-of-the-art (SoTA) LLMs on gold standard QA datasets, demonstrating the potential of integrating internet search and user feedback for improved performance and generalizability.
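To make the pipeline described above concrete, the following minimal Python sketch shows how a retrieve-answer-feedback loop of this kind could be wired together. All names here (KnowledgeStore, retrieve_from_web, small_lm_answer, qa_with_feedback) are illustrative assumptions for exposition only; they are not the authors' code, and the paper's meta-learning and model internals are not reproduced.

# Minimal, hypothetical sketch of an AcKnowledge-style QA loop.
# NOTE: retrieve_from_web(), small_lm_answer(), and KnowledgeStore are
# illustrative stand-ins, not the authors' implementation.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class KnowledgeStore:
    """Text snippets acquired at question time (no pre-trained corpus)."""
    snippets: List[str] = field(default_factory=list)

    def add(self, texts: List[str]) -> None:
        self.snippets.extend(texts)


def retrieve_from_web(question: str) -> List[str]:
    """Hypothetical stand-in for the internet-search step; a real system
    would issue a search query and clean the returned pages."""
    return [f"snippet retrieved for: {question}"]


def small_lm_answer(question: str, store: KnowledgeStore) -> str:
    """Hypothetical stand-in for the small, non-pre-trained model that
    answers using only the acquired snippets as context."""
    return f"answer to '{question}' from {len(store.snippets)} snippet(s)"


def qa_with_feedback(
    question: str,
    is_correct: Callable[[str], bool],
    max_retries: int = 2,
) -> str:
    """Acquire knowledge, answer, and re-acquire when feedback is negative,
    mirroring the re-learning-from-feedback loop described in the abstract."""
    store = KnowledgeStore()
    store.add(retrieve_from_web(question))
    response = small_lm_answer(question, store)
    for _ in range(max_retries):
        if is_correct(response):
            break
        # Knowledge was misrepresented: fetch fresh snippets and retry.
        store.add(retrieve_from_web(question))
        response = small_lm_answer(question, store)
    return response


if __name__ == "__main__":
    print(qa_with_feedback("Who proposed AcKnowledge?", is_correct=lambda r: True))

In this sketch the user-feedback signal is abstracted as the is_correct callable; the actual system described in the abstract gathers that signal from real users and re-learns accordingly.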
Anthology ID:
2024.knowllm-1.8
Volume:
Proceedings of the 1st Workshop on Towards Knowledgeable Language Models (KnowLLM 2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Sha Li, Manling Li, Michael JQ Zhang, Eunsol Choi, Mor Geva, Peter Hase, Heng Ji
Venues:
KnowLLM | WS
Publisher:
Association for Computational Linguistics
Pages:
83–95
URL:
https://aclanthology.org/2024.knowllm-1.8
Cite (ACL):
Sourav Das, Sanjay Chatterji, and Imon Mukherjee. 2024. AcKnowledge: Acquired Knowledge Representation by Small Language Model Without Pre-training. In Proceedings of the 1st Workshop on Towards Knowledgeable Language Models (KnowLLM 2024), pages 83–95, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
AcKnowledge: Acquired Knowledge Representation by Small Language Model Without Pre-training (Das et al., KnowLLM-WS 2024)
PDF:
https://aclanthology.org/2024.knowllm-1.8.pdf