Utilizing Cross-Modal Contrastive Learning to Improve Item Categorization BERT Model

Lei Chen, Hou Wei Chou


Abstract
Item categorization (IC) is a core natural language processing (NLP) task in e-commerce. Since IC is a special text classification task, fine-tuning pre-trained models, e.g., BERT, has become a mainstream solution. To further improve IC performance, other product metadata, e.g., product images, have been used. Although multimodal IC (MIC) systems show higher performance, expanding from processing text to processing more resource-demanding images incurs a large engineering cost and hinders the deployment of such dual-input MIC systems. In this paper, we propose a new way of using product images to improve a text-only IC model: leveraging cross-modal signals between product titles and their associated images to adapt BERT models in a self-supervised learning (SSL) manner. Our experiments on three genres of the public Amazon product dataset show that the proposed method achieves higher prediction accuracy and macro-F1 than simply fine-tuning the original BERT. Moreover, the proposed method can keep using an existing text-only IC inference implementation and has a resource advantage over deploying a dual-input MIC system.
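To make the adaptation step concrete, below is a minimal sketch of a cross-modal contrastive objective of the kind the abstract describes, written in PyTorch with Hugging Face Transformers. It treats each matched (title, image) pair in a batch as a positive and all other pairings as negatives via a symmetric InfoNCE loss. The projection sizes, the assumption of precomputed image features, and names such as image_feats are illustrative choices, not the authors' exact implementation.

    # Minimal sketch (assumptions noted above): contrastively align BERT title
    # embeddings with image embeddings, then fine-tune the adapted BERT for IC.
    import torch
    import torch.nn.functional as F
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    bert = AutoModel.from_pretrained("bert-base-uncased")

    # Project both modalities into a shared space; 2048-dim image features
    # (e.g., from any frozen vision backbone) are an assumed input format.
    text_proj = torch.nn.Linear(768, 256)
    img_proj = torch.nn.Linear(2048, 256)

    def contrastive_loss(titles, image_feats, temperature=0.07):
        """Symmetric InfoNCE: diagonal (title, image) pairs are positives,
        all off-diagonal pairs in the batch serve as negatives."""
        toks = tokenizer(titles, padding=True, truncation=True,
                         return_tensors="pt")
        cls = bert(**toks).last_hidden_state[:, 0]   # [CLS] title embedding
        t = F.normalize(text_proj(cls), dim=-1)
        v = F.normalize(img_proj(image_feats), dim=-1)
        logits = t @ v.t() / temperature             # batch similarity matrix
        labels = torch.arange(len(titles))           # diagonal = matched pairs
        return (F.cross_entropy(logits, labels) +
                F.cross_entropy(logits.t(), labels)) / 2

After this SSL adaptation, the BERT encoder is fine-tuned on labeled titles as usual, so the IC inference path remains text-only.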
Anthology ID:
2022.ecnlp-1.25
Volume:
Proceedings of the Fifth Workshop on e-Commerce and NLP (ECNLP 5)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Shervin Malmasi, Oleg Rokhlenko, Nicola Ueffing, Ido Guy, Eugene Agichtein, Surya Kallumadi
Venue:
ECNLP
Publisher:
Association for Computational Linguistics
Pages:
217–223
URL:
https://aclanthology.org/2022.ecnlp-1.25
DOI:
10.18653/v1/2022.ecnlp-1.25
Cite (ACL):
Lei Chen and Hou Wei Chou. 2022. Utilizing Cross-Modal Contrastive Learning to Improve Item Categorization BERT Model. In Proceedings of the Fifth Workshop on e-Commerce and NLP (ECNLP 5), pages 217–223, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Utilizing Cross-Modal Contrastive Learning to Improve Item Categorization BERT Model (Chen & Chou, ECNLP 2022)
PDF:
https://aclanthology.org/2022.ecnlp-1.25.pdf
Video:
https://aclanthology.org/2022.ecnlp-1.25.mp4