Learn Together: Joint Multitask Finetuning of Pretrained KG-enhanced LLM for Downstream Tasks
Anastasia Martynova | Vladislav Tishin | Natalia Semenova
Proceedings of the Workshop on Generative AI and Knowledge Graphs (GenAIK), 2025
Recent studies have shown that a knowledge graph (KG) can enhance text data by providing structured background knowledge, which can significantly improve the language understanding abilities of LLMs. Moreover, finetuning such models yields solid results on commonsense reasoning benchmarks. In this work, we introduce an expandable Joint Multitask Finetuning approach on a pretrained KG-enhanced LLM for the Question Answering (QA), Machine Reading Comprehension (MRC), and Knowledge Graph Question Answering (KGQA) tasks. Extensive experiments show competitive performance of joint QA+MRC+KGQA finetuning over the single-task approach, with a maximum gain of 30% in accuracy.
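To make the setup concrete, the sketch below illustrates one common way to implement joint multitask finetuning: a shared encoder with one lightweight head per task, trained with a round-robin task schedule. All module names, dimensions, the toy data, and the round-robin schedule are illustrative assumptions standing in for the paper's actual KG-enhanced LLM and training recipe, which the abstract does not specify.

```python
# Minimal sketch of joint multitask finetuning over QA, MRC, and KGQA.
# The tiny Transformer stands in for a pretrained KG-enhanced LLM encoder;
# sizes, data, and the task schedule are illustrative assumptions.
import torch
import torch.nn as nn

TASKS = ["qa", "mrc", "kgqa"]  # the three tasks finetuned jointly

class MultitaskModel(nn.Module):
    def __init__(self, hidden=128, vocab=1000, num_labels=4):
        super().__init__()
        # Stand-in for the pretrained KG-enhanced encoder (assumption).
        self.embed = nn.Embedding(vocab, hidden)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True),
            num_layers=2,
        )
        # One task-specific classification head per task.
        self.heads = nn.ModuleDict({t: nn.Linear(hidden, num_labels) for t in TASKS})

    def forward(self, task, input_ids):
        h = self.encoder(self.embed(input_ids))  # (batch, seq, hidden)
        return self.heads[task](h.mean(dim=1))   # mean-pool, then task logits

model = MultitaskModel()
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = nn.CrossEntropyLoss()

# Round-robin schedule: each step draws a batch from one task and updates
# the shared encoder together with that task's head, so all tasks shape
# the same backbone.
for step in range(6):
    task = TASKS[step % len(TASKS)]
    input_ids = torch.randint(0, 1000, (8, 16))  # toy batch of token ids
    labels = torch.randint(0, 4, (8,))           # toy labels
    logits = model(task, input_ids)
    loss = loss_fn(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"step {step} task={task} loss={loss.item():.3f}")
```

The key design point such a setup exercises is parameter sharing: the encoder receives gradients from every task, which is what allows joint QA+MRC+KGQA finetuning to outperform training each task in isolation.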