Exploring the Value of Multi-View Learning for Session-Aware Query Representation
Diego Ortiz | Jose Moreno | Gilles Hubert | Karen Pinel-Sauvagnat | Lynda Tamine
Findings of the Association for Computational Linguistics: NAACL 2022
Recent years have witnessed growing interest in learning distributed query representations that capture search intent semantics. Most existing approaches learn query embeddings using relevance supervision, making them suited only to document ranking tasks. Moreover, they generally consider either the user’s query reformulations or the system’s rankings, whereas previous findings show that the user’s query behavior and knowledge change depending on the system’s results, and that both intertwine and affect each other during the completion of a search task. In this paper, we explore the value of multi-view learning for generic and unsupervised session-aware query representation learning. First, single-view query embeddings are obtained in separate spaces from query reformulation and document ranking representations using transformers. Then, we investigate the use of linear (CCA) and non-linear (UMAP) multi-view learning methods to align those spaces, with the aim of revealing similarity traits in the shared multi-view space. Experimental evaluation is carried out on query classification and session-based retrieval downstream tasks, using the KDD and TREC Session datasets respectively. The results show that multi-view learning is an effective and controllable approach for unsupervised learning of generic query representations and can reflect search behavior patterns.
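The abstract describes the alignment step only at a high level. As a rough illustration (not the paper's exact procedure), the sketch below projects two hypothetical single-view query embedding matrices into a shared space using scikit-learn's CCA, with a concatenation-plus-UMAP variant as one simple non-linear alternative; the matrix names, dimensions, and component counts are assumptions for illustration.

```python
# Illustrative sketch: aligning two single-view query embedding spaces
# (one from query reformulations, one from document rankings) into a
# shared multi-view space. Shapes and names below are hypothetical.
import numpy as np
from sklearn.cross_decomposition import CCA
import umap  # umap-learn

rng = np.random.default_rng(0)
n_queries, dim = 500, 128

# Stand-ins for single-view embeddings produced by separate transformer encoders.
X_reform = rng.standard_normal((n_queries, dim))  # reformulation view
X_rank = rng.standard_normal((n_queries, dim))    # document-ranking view

# Linear alignment: CCA projects both views into a shared k-dimensional
# space in which corresponding queries are maximally correlated.
k = 16
cca = CCA(n_components=k, max_iter=1000)
Z_reform, Z_rank = cca.fit_transform(X_reform, X_rank)

# Non-linear alternative (one possible setup, assumed here): reduce the
# concatenated views with UMAP to obtain a shared representation.
Z_shared = umap.UMAP(n_components=k).fit_transform(
    np.hstack([X_reform, X_rank])
)

print(Z_reform.shape, Z_rank.shape, Z_shared.shape)
```

In this kind of pipeline, the shared-space vectors (rather than the raw single-view embeddings) would then feed downstream tasks such as query classification or session-based retrieval.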