Geo-BERT Pre-training Model for Query Rewriting in POI Search

Xiao Liu, Juan Hu, Qi Shen, Huan Chen


Abstract
Query Rewriting (QR) addresses the vocabulary mismatch between queries and documents in Web search. Existing approaches usually model QR with an end-to-end sequence-to-sequence (seq2seq) model. State-of-the-art Transformer-based models can effectively learn textual semantics from user session logs, but they often ignore users’ geographic location, which is crucial for the Point-of-Interest (POI) search of map services. In this paper, we propose a pre-training model, called Geo-BERT, that integrates semantic and geographic information in the pre-trained representations of POIs. First, we model the real-world distribution of POIs as a graph whose nodes represent POIs and multiple geographic granularities. We then apply graph representation learning to obtain geographic embeddings. Finally, we train a BERT-like pre-training model on text together with the POIs’ graph embeddings to obtain a unified representation of geographic and semantic information, and apply it to QR in POI search. The proposed model achieves excellent accuracy on a wide range of real-world datasets from map services.
Anthology ID:
2021.findings-emnlp.190
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2209–2214
URL:
https://aclanthology.org/2021.findings-emnlp.190
DOI:
10.18653/v1/2021.findings-emnlp.190
Cite (ACL):
Xiao Liu, Juan Hu, Qi Shen, and Huan Chen. 2021. Geo-BERT Pre-training Model for Query Rewriting in POI Search. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2209–2214, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Geo-BERT Pre-training Model for Query Rewriting in POI Search (Liu et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.190.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.190.mp4