SSP: Self-Supervised Post-training for Conversational Search

Quan Tu, Shen Gao, Xiaolong Wu, Zhao Cao, Ji-Rong Wen, Rui Yan


Abstract
Conversational search has been regarded as the next-generation search paradigm. Constrained by data scarcity, most existing methods distill a well-trained ad-hoc retriever into a conversational retriever. However, these methods, which usually initialize parameters via query reformulation to discover contextualized dependencies, have trouble understanding dialogue structure information and struggle with contextual semantic vanishing. In this paper, we propose Self-Supervised Post-training (SSP), a new post-training paradigm with three self-supervised tasks that efficiently initializes the conversational search model to enhance dialogue structure and contextual semantic understanding. Furthermore, SSP can be plugged into most existing conversational models to boost their performance. To verify the effectiveness of our proposed method, we apply the conversational encoder post-trained by SSP to the conversational search task on two benchmark datasets: CAsT-19 and CAsT-20. Extensive experiments show that SSP can boost the performance of several existing conversational search methods. Our source code is available at https://github.com/morecry/SSP.
Anthology ID:
2023.findings-acl.837
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13237–13249
URL:
https://aclanthology.org/2023.findings-acl.837
DOI:
10.18653/v1/2023.findings-acl.837
Cite (ACL):
Quan Tu, Shen Gao, Xiaolong Wu, Zhao Cao, Ji-Rong Wen, and Rui Yan. 2023. SSP: Self-Supervised Post-training for Conversational Search. In Findings of the Association for Computational Linguistics: ACL 2023, pages 13237–13249, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
SSP: Self-Supervised Post-training for Conversational Search (Tu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.837.pdf