Metric Learning for Dynamic Text Classification

Jeremy Wohlwend, Ethan R. Elenberg, Sam Altschul, Shawn Henry, Tao Lei


Abstract
Traditional text classifiers are limited to predicting over a fixed set of labels. However, in many real-world applications the label set is frequently changing. For example, in intent classification, new intents may be added over time while others are removed. We propose to address the problem of dynamic text classification by replacing the traditional, fixed-size output layer with a learned, semantically meaningful metric space. Here the distances between textual inputs are optimized to perform nearest-neighbor classification across overlapping label sets. Changing the label set does not involve removing parameters, but rather simply adding or removing support points in the metric space. Then the learned metric can be fine-tuned with only a few additional training examples. We demonstrate that this simple strategy is robust to changes in the label space. Furthermore, our results show that learning a non-Euclidean metric can improve performance in the low data regime, suggesting that further work on metric spaces may benefit low-resource research.
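The core idea of the abstract, classifying by distance to per-label support points so that labels can be added or removed without touching model parameters, can be illustrated with a minimal sketch. This is not the authors' implementation (their code is in the repository linked below); the toy 2-D embeddings, the label names, and the use of plain Euclidean distance are all illustrative assumptions.

```python
import numpy as np

# Hypothetical support points: one or more embedded examples per label.
# In the paper's setting these would come from an encoder trained with a
# metric-learning objective; here we use fixed toy vectors for illustration.
support = {
    "greeting": [np.array([1.0, 0.0]), np.array([0.9, 0.1])],
    "refund":   [np.array([0.0, 1.0])],
}

def classify(x, support):
    """Nearest-neighbor prediction: return the label of the closest support point."""
    best_label, best_dist = None, float("inf")
    for label, points in support.items():
        for p in points:
            # Euclidean distance here; the paper also explores non-Euclidean metrics.
            d = np.linalg.norm(x - p)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label

# Changing the label set requires no parameter changes:
# adding an intent is just adding support points.
support["cancel"] = [np.array([-1.0, 0.0])]

print(classify(np.array([0.95, 0.05]), support))  # -> greeting
print(classify(np.array([-0.8, 0.1]), support))   # -> cancel
```

Under this view, fine-tuning on a few examples of a new label amounts to updating the encoder so the new support points separate cleanly from their neighbors in the metric space.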
Anthology ID:
D19-6116
Volume:
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Colin Cherry, Greg Durrett, George Foster, Reza Haffari, Shahram Khadivi, Nanyun Peng, Xiang Ren, Swabha Swayamdipta
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
143–152
URL:
https://aclanthology.org/D19-6116
DOI:
10.18653/v1/D19-6116
Cite (ACL):
Jeremy Wohlwend, Ethan R. Elenberg, Sam Altschul, Shawn Henry, and Tao Lei. 2019. Metric Learning for Dynamic Text Classification. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), pages 143–152, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Metric Learning for Dynamic Text Classification (Wohlwend et al., 2019)
PDF:
https://aclanthology.org/D19-6116.pdf
Code:
asappresearch/dynamic-classification
Data:
WOS