Deploying Unified BERT Moderation Model for E-Commerce Reviews

Ravindra Nayak, Nikesh Garera


Abstract
Moderation of user-generated e-commerce content has become crucial due to the large and diverse user base on the platforms. Product reviews and ratings have become an integral part of the shopping experience to build trust among users. Due to the high volume of reviews generated on a vast catalog of products, manual moderation is infeasible, making machine moderation a necessity. In this work, we described our deployed system and models for automated moderation of user-generated content. At the heart of our approach, we outline several rejection reasons for review & rating moderation and explore a unified BERT model to moderate them. We convey the importance of product vertical embeddings for the relevancy of the review for a given product and highlight the advantages of pre-training the BERT models with monolingual data to cope with the domain gap in the absence of huge labelled datasets. We observe a 4.78% F1 increase with less labelled data and a 2.57% increase in F1 score on the review data compared to the publicly available BERT-based models. Our best model In-House-BERT-vertical sends only 5.89% of total reviews to manual moderation and has been deployed in production serving live traffic for millions of users.
Anthology ID:
2022.emnlp-industry.55
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
December
Year:
2022
Address:
Abu Dhabi, UAE
Editors:
Yunyao Li, Angeliki Lazaridou
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
540–547
URL:
https://aclanthology.org/2022.emnlp-industry.55
DOI:
10.18653/v1/2022.emnlp-industry.55
Cite (ACL):
Ravindra Nayak and Nikesh Garera. 2022. Deploying Unified BERT Moderation Model for E-Commerce Reviews. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 540–547, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Deploying Unified BERT Moderation Model for E-Commerce Reviews (Nayak & Garera, EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-industry.55.pdf