Multi-word Term Embeddings Improve Lexical Product Retrieval

Viktor Shcherbakov, Fedor Krasnov


Abstract
Product search differs substantially from search for documents, Internet resources, or job vacancies, and therefore requires specialized search systems. The present work describes the H1 embedding model, designed for offline term indexing of product descriptions on e-commerce platforms. The model is compared to other state-of-the-art (SoTA) embedding models within the framework of a hybrid product search system that combines the advantages of lexical retrieval methods and semantic embedding-based methods. We propose an approach to building semantically rich term vocabularies for search indexes. Compared to other production semantic models, H1 paired with the proposed approach stands out due to its ability to process multi-word product terms as one token. For example, in the search queries “new balance shoes” and “gloria jeans kids wear”, the brand entities are represented as single tokens: “new balance” and “gloria jeans”. This increases the precision of the system without affecting recall. The hybrid search system with the proposed model scores mAP@12 = 56.1% and R@1k = 86.6% on the public WANDS dataset, outperforming other SoTA analogues.
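
The multi-word tokenization behaviour described in the abstract can be illustrated with a short sketch. This is not the authors' H1 implementation: the term vocabulary, the greedy longest-match strategy, and the example phrases below are illustrative assumptions only.

# Minimal sketch (not the paper's H1 implementation): treat known
# multi-word product terms as single tokens via greedy longest match.
# MULTI_WORD_TERMS is a hypothetical stand-in for the semantically
# rich term vocabulary built during offline indexing.
MULTI_WORD_TERMS = {("new", "balance"), ("gloria", "jeans")}
MAX_TERM_LEN = max(len(term) for term in MULTI_WORD_TERMS)

def tokenize(query: str) -> list[str]:
    words = query.lower().split()
    tokens, i = [], 0
    while i < len(words):
        # Prefer the longest multi-word term starting at position i.
        for n in range(min(MAX_TERM_LEN, len(words) - i), 1, -1):
            if tuple(words[i:i + n]) in MULTI_WORD_TERMS:
                tokens.append(" ".join(words[i:i + n]))
                i += n
                break
        else:
            tokens.append(words[i])
            i += 1
    return tokens

print(tokenize("new balance shoes"))       # ['new balance', 'shoes']
print(tokenize("gloria jeans kids wear"))  # ['gloria jeans', 'kids', 'wear']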
Anthology ID: 2024.ecnlp-1.12
Volume: Proceedings of the Seventh Workshop on e-Commerce and NLP @ LREC-COLING 2024
Month: May
Year: 2024
Address: Torino, Italia
Editors: Shervin Malmasi, Besnik Fetahu, Nicola Ueffing, Oleg Rokhlenko, Eugene Agichtein, Ido Guy
Venues: ECNLP | WS
Publisher: ELRA and ICCL
Pages: 115–124
URL: https://aclanthology.org/2024.ecnlp-1.12
Cite (ACL): Viktor Shcherbakov and Fedor Krasnov. 2024. Multi-word Term Embeddings Improve Lexical Product Retrieval. In Proceedings of the Seventh Workshop on e-Commerce and NLP @ LREC-COLING 2024, pages 115–124, Torino, Italia. ELRA and ICCL.
Cite (Informal): Multi-word Term Embeddings Improve Lexical Product Retrieval (Shcherbakov & Krasnov, ECNLP-WS 2024)
PDF: https://aclanthology.org/2024.ecnlp-1.12.pdf