RaLLe: A Framework for Developing and Evaluating Retrieval-Augmented Large Language Models
Yasuto Hoshi | Daisuke Miyashita | Youyang Ng | Kento Tatsuno | Yasuhiro Morioka | Osamu Torii | Jun Deguchi
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Retrieval-augmented large language models (R-LLMs) combine pre-trained large language models (LLMs) with information retrieval systems to improve the accuracy of factual question answering. However, current libraries for building R-LLMs provide high-level abstractions without sufficient transparency for evaluating and optimizing prompts within specific inference processes such as retrieval and generation. To address this gap, we present RaLLe, an open-source framework designed to facilitate the development, evaluation, and optimization of R-LLMs for knowledge-intensive tasks. With RaLLe, developers can easily build and evaluate R-LLMs, improving hand-crafted prompts, assessing individual inference processes, and objectively measuring overall system performance. By leveraging these features, developers can enhance the performance and accuracy of their R-LLMs on knowledge-intensive generation tasks.