On-Device Text Representations Robust To Misspellings via Projections

Chinnadhurai Sankar, Sujith Ravi, Zornitsa Kozareva


Abstract
Recently, there has been strong interest in developing natural language applications that live on personal devices such as mobile phones, watches, and IoT devices, with the objective of preserving user privacy and keeping memory requirements low. Advances in Locality-Sensitive Hashing (LSH)-based projection networks have demonstrated state-of-the-art performance on various classification tasks without explicit word (or word-piece) embedding lookup tables, by computing text representations on the fly. In this paper, we show that projection-based neural classifiers are inherently robust to misspellings and perturbations of the input text. We empirically demonstrate that LSH projection-based classifiers are more robust to common misspellings than BiLSTMs (with both word-piece and word-only tokenization) and fine-tuned BERT-based methods. When subjected to misspelling attacks, LSH projection-based classifiers had a small average accuracy drop of 2.94% across multiple classification tasks, while the fine-tuned BERT model's accuracy dropped significantly, by 11.44%.
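The core idea of an on-the-fly LSH projection can be sketched roughly as follows. This is a minimal illustration only: it assumes character-trigram features and a hash-seeded ±1 sign pattern per feature (SimHash-style), standing in for the paper's actual projection functions and feature set, which may differ.

```python
import hashlib


def char_ngrams(text, n=3):
    # Character trigram features: no word or word-piece vocabulary,
    # so a misspelling only perturbs a few of the extracted features.
    text = f"#{text}#"
    return [text[i:i + n] for i in range(len(text) - n + 1)]


def lsh_projection(text, dim=64):
    # On-the-fly projection: each feature contributes a deterministic
    # pseudo-random +/-1 pattern derived from its hash; the sign of the
    # accumulated sum gives a binary representation. No embedding
    # lookup table is stored, so the memory footprint stays small.
    acc = [0] * dim
    for feat in char_ngrams(text):
        digest = hashlib.md5(feat.encode("utf-8")).digest()
        for j in range(dim):
            bit = (digest[j // 8] >> (j % 8)) & 1
            acc[j] += 1 if bit else -1
    return [1 if v > 0 else 0 for v in acc]
```

Because a single misspelled character changes only a handful of trigrams, most features (and hence most of the accumulated signs) are unchanged, which is the intuition behind the robustness result.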
Anthology ID:
2021.eacl-main.250
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
2871–2876
URL:
https://aclanthology.org/2021.eacl-main.250
DOI:
10.18653/v1/2021.eacl-main.250
Cite (ACL):
Chinnadhurai Sankar, Sujith Ravi, and Zornitsa Kozareva. 2021. On-Device Text Representations Robust To Misspellings via Projections. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 2871–2876, Online. Association for Computational Linguistics.
Cite (Informal):
On-Device Text Representations Robust To Misspellings via Projections (Sankar et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.250.pdf