BRENT: Bidirectional Retrieval Enhanced Norwegian Transformer

Lucas Charpentier, Sondre Wold, David Samuel, Egil Rønningstad


Abstract
Retrieval-based language models are increasingly employed in question-answering tasks. These models search a corpus of documents for relevant information instead of storing all factual knowledge in their parameters, thereby enhancing efficiency, transparency, and adaptability. We develop the first Norwegian retrieval-based model by adapting the REALM framework and evaluate it on various tasks. After training, we also separate the language model, which we call the reader, from the retriever components, and show that the reader can be fine-tuned on a range of downstream tasks. Results show that retrieval-augmented language modeling improves the reader’s performance on extractive question-answering, suggesting that this type of training improves language models’ general ability to use context, and that this does not come at the expense of other abilities such as part-of-speech tagging, dependency parsing, named entity recognition, and lemmatization. Code, trained models, and data are made publicly available.
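
As an illustrative sketch only (not the authors' implementation), the retrieval step of a REALM-style model as described in the abstract can be approximated as follows: the retriever scores each document in the corpus by the inner product of its embedding with the query embedding, and the top-k scores are normalized into a retrieval distribution p(z|x) that the reader then marginalizes over. The toy random embeddings and the names doc_embeddings and query_embedding below are hypothetical; in REALM these vectors come from jointly trained BERT-style encoders.

    import numpy as np

    # Hypothetical toy dimensions; REALM uses a large precomputed document index.
    DIM, CORPUS_SIZE, TOP_K = 4, 6, 2

    rng = np.random.default_rng(0)
    doc_embeddings = rng.normal(size=(CORPUS_SIZE, DIM))  # stand-in document index
    query_embedding = rng.normal(size=(DIM,))             # stand-in encoded question

    # Relevance score is the inner product between query and document embeddings.
    scores = doc_embeddings @ query_embedding

    # Take the top-k documents and softmax their scores into a retrieval
    # distribution p(z | x) over which the reader marginalizes.
    top_k = np.argsort(scores)[::-1][:TOP_K]
    top_scores = scores[top_k]
    p_retrieval = np.exp(top_scores - top_scores.max())
    p_retrieval /= p_retrieval.sum()

    for doc_id, p in zip(top_k, p_retrieval):
        print(f"document {doc_id}: p(z|x) = {p:.3f}")

In practice the inner-product search is done with maximum inner product search (MIPS) over the full index rather than a dense matrix product, which is what makes retrieval over a large corpus tractable.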
Anthology ID:
2023.nodalida-1.21
Volume:
Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)
Month:
May
Year:
2023
Address:
Tórshavn, Faroe Islands
Editors:
Tanel Alumäe, Mark Fishel
Venue:
NoDaLiDa
Publisher:
University of Tartu Library
Pages:
202–214
URL:
https://aclanthology.org/2023.nodalida-1.21
Cite (ACL):
Lucas Charpentier, Sondre Wold, David Samuel, and Egil Rønningstad. 2023. BRENT: Bidirectional Retrieval Enhanced Norwegian Transformer. In Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa), pages 202–214, Tórshavn, Faroe Islands. University of Tartu Library.
Cite (Informal):
BRENT: Bidirectional Retrieval Enhanced Norwegian Transformer (Charpentier et al., NoDaLiDa 2023)
PDF:
https://aclanthology.org/2023.nodalida-1.21.pdf