Unsupervised Anomaly Detection in Multi-Topic Short-Text Corpora

Mira Ait-Saada, Mohamed Nadif


Abstract
Unsupervised anomaly detection seeks to identify deviant data samples in a dataset without using labels and constitutes a challenging task, particularly when the majority class is heterogeneous. This paper addresses this topic for textual data and aims to determine whether a text sample is an outlier within a potentially multi-topic corpus. To this end, it is crucial to grasp the semantic aspects of words, particularly when dealing with short texts, since it is difficult to syntactically discriminate data samples based only on a few words. We therefore use word embeddings to represent each sample by a dense vector that efficiently captures the underlying semantics. Then, we rely on the Mixture Model approach to detect which samples deviate the most from the underlying distributions of the corpus. Experiments carried out on real datasets show the effectiveness of the proposed approach in comparison to state-of-the-art techniques, both in terms of performance and time efficiency, especially when more than one topic is present in the corpus.
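The pipeline described in the abstract (dense embeddings, then mixture-model likelihood scoring) can be sketched roughly as follows. This is a minimal illustration, not the authors' exact method: the random vectors stand in for real text embeddings, the cluster layout, component count, and anomaly budget are all assumptions, and a Gaussian mixture (via scikit-learn) is used as one concrete instance of the Mixture Model approach.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in embeddings: in practice each row would be the dense vector
# representation of one short text (e.g. aggregated word embeddings).
# Here we simulate a two-topic corpus plus a few off-distribution samples.
topic_a = rng.normal(loc=0.0, scale=1.0, size=(100, 16))  # topic 1
topic_b = rng.normal(loc=8.0, scale=1.0, size=(100, 16))  # topic 2
outliers = rng.uniform(low=-6.0, high=14.0, size=(5, 16))  # scattered anomalies
X = np.vstack([topic_a, topic_b, outliers])

# Fit one mixture component per presumed topic, then score every sample
# by its log-likelihood under the fitted mixture: the least likely
# samples are the candidate anomalies.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
scores = gmm.score_samples(X)

n_anomalies = 5  # assumed anomaly budget for this toy corpus
flagged = np.argsort(scores)[:n_anomalies]  # indices of the least likely samples
print(sorted(flagged.tolist()))
```

With multiple components, each topic gets its own density, so a sample is only anomalous if it is unlikely under *every* topic; a single-density model would instead flag samples that merely sit between topics.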
Anthology ID:
2023.eacl-main.101
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1392–1403
URL:
https://aclanthology.org/2023.eacl-main.101
DOI:
10.18653/v1/2023.eacl-main.101
Cite (ACL):
Mira Ait-Saada and Mohamed Nadif. 2023. Unsupervised Anomaly Detection in Multi-Topic Short-Text Corpora. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1392–1403, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Anomaly Detection in Multi-Topic Short-Text Corpora (Ait-Saada & Nadif, EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.101.pdf
Video:
https://aclanthology.org/2023.eacl-main.101.mp4