Contrastive Pre-training for Personalized Expert Finding

Qiyao Peng, Hongtao Liu, Zhepeng Lv, Qing Yang, Wenjun Wang


Abstract
Expert finding helps route questions to suitable users who can answer them on Community Question Answering (CQA) platforms. Hence, it is essential to learn accurate representations of experts and questions from the question text. Recently, the pre-training and fine-tuning paradigm has proven powerful for natural language understanding, suggesting its potential for better question modeling and expert finding. Inspired by this, we propose a CQA-domain Contrastive Pre-training framework for Expert Finding, named CPEF, which learns more comprehensive question representations. Specifically, since question titles and bodies are semantically complementary, we propose a title-body contrastive learning task for the domain pre-training phase: the question title and its corresponding body are treated directly as positive samples of each other, avoiding the need for extra data-augmentation strategies. Furthermore, a personalized tuning network is proposed to inject each expert's personalized preferences during the fine-tuning phase. Extensive experimental results on six real-world datasets demonstrate that our method achieves superior performance for expert finding.
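
To make the title-body contrastive task concrete, here is a minimal sketch of an in-batch InfoNCE objective in which each question's title embedding and body embedding form a positive pair and all other in-batch pairs serve as negatives. The function name, the temperature value, and the symmetric two-direction formulation are illustrative assumptions, not the paper's exact implementation:

```python
import torch
import torch.nn.functional as F

def title_body_contrastive_loss(title_emb, body_emb, temperature=0.05):
    """In-batch InfoNCE loss treating each question's title and body
    embeddings as positives of each other (sketch; the paper's exact
    loss may differ).

    title_emb, body_emb: [batch_size, hidden_dim] sentence embeddings,
    e.g., [CLS] vectors from a shared pre-trained encoder.
    """
    title_emb = F.normalize(title_emb, dim=-1)
    body_emb = F.normalize(body_emb, dim=-1)
    # Cosine similarity matrix: entry (i, j) compares title i with body j.
    sim = title_emb @ body_emb.t() / temperature
    # Diagonal entries are the matching title-body positive pairs.
    labels = torch.arange(sim.size(0), device=sim.device)
    # Symmetric cross-entropy over title-to-body and body-to-title directions.
    return (F.cross_entropy(sim, labels) + F.cross_entropy(sim.t(), labels)) / 2
```

Because the positive pair comes directly from the question's own structure, no separate data-augmentation step (e.g., dropout views or token perturbation) is needed to build contrastive pairs.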
Anthology ID:
2023.findings-emnlp.1058
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15797–15806
URL:
https://aclanthology.org/2023.findings-emnlp.1058
DOI:
10.18653/v1/2023.findings-emnlp.1058
Cite (ACL):
Qiyao Peng, Hongtao Liu, Zhepeng Lv, Qing Yang, and Wenjun Wang. 2023. Contrastive Pre-training for Personalized Expert Finding. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15797–15806, Singapore. Association for Computational Linguistics.
Cite (Informal):
Contrastive Pre-training for Personalized Expert Finding (Peng et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.1058.pdf