RJAG: Retrieval Judgment Augmented Generation
Kuangzhi Wang | Huzhenhua Huzhenhua | Ren Min | Xiangzhi Tao
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
"Large Language Models (LLMs) inevitably suffer from hallucinations, as relying solely on their parametric knowledge cannot guarantee the accuracy of generated content. Retrieval-augmented generation (RAG) addresses this by incorporating external knowledge into text generation. However, its effectiveness heavily depends on the relevance of the retrieved documents, which poses a critical challenge: how to ensure the accuracy and reliability of model responses when retrieval results are inaccurate. To tackle this challenge, we propose Retrieval Judgment Augmented Generation (RJAG), a method that enhances RAG through an LLM-driven fine-grained relevance judgment mechanism and a task-adaptive knowledge combination strategy. RJAG judges and dynamically combines retrieved documents for both open-ended generation and closed-ended selection tasks. Additionally, large-scale web search is incorporated to expand knowledge beyond static corpora. Experimental results on multiple benchmarks show that RJAG outperforms existing RAG methods, significantly enhancing accuracy and reliability while maintaining system simplicity. Code is available at https://github.com/wangkz2023/RJAG."
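The pipeline the abstract describes (judge each retrieved document's relevance, combine only the relevant ones, and fall back to web search when retrieval fails) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the real RJAG uses an LLM prompt as the relevance judge, while here a simple token-overlap score stands in so the sketch is runnable, and the web-search fallback is a hypothetical stub.

```python
def judge_relevance(query: str, doc: str) -> float:
    """Placeholder relevance judge.

    RJAG scores each retrieved document with an LLM; here a simple
    token-overlap ratio stands in so the sketch stays self-contained.
    """
    query_tokens = set(query.lower().split())
    doc_tokens = set(doc.lower().split())
    return len(query_tokens & doc_tokens) / max(len(query_tokens), 1)


def combine_context(query: str, docs: list[str], threshold: float = 0.3) -> list[str]:
    """Keep only documents the judge deems relevant, best-scoring first.

    When no document passes the threshold, fall back to a stubbed
    web-search hook, standing in for RJAG's web-search expansion.
    """
    scored = [(judge_relevance(query, d), d) for d in docs]
    relevant = [d for score, d in sorted(scored, reverse=True) if score >= threshold]
    if not relevant:
        # Hypothetical fallback; a real system would call a search API here.
        relevant = [f"[web-search results for: {query}]"]
    return relevant


docs = [
    "Hamlet was written by William Shakespeare",
    "Paris is the capital of France",
]
print(combine_context("who wrote hamlet", docs))
# Only the Shakespeare document passes the judge for this query.
```

The selected context would then be prepended to the generation prompt; the task-adaptive combination strategy (open-ended generation vs. closed-ended selection) would choose how many judged documents to include and how to format them.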