ODIST: Open World Classification via Distributionally Shifted Instances

Lei Shu, Yassine Benajiba, Saab Mansour, Yi Zhang


Abstract
In this work, we address the open-world classification problem with a method called ODIST (open world classification via distributionally shifted instances). This novel and straightforward method creates out-of-domain instances from the in-domain training instances with the help of a pre-trained generative language model. Experimental results show that ODIST performs better than the state-of-the-art decision-boundary-finding method.
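
The core idea described in the abstract, generating pseudo out-of-domain instances from in-domain examples with a pre-trained generative language model, can be illustrated with a short sketch. The snippet below is a hedged illustration of this generate-then-label idea, not the paper's exact recipe: the choice of GPT-2, the prefix-continuation strategy, and the sampling settings are assumptions made only for the example.

```python
# A minimal sketch of the generate-then-label idea behind distributionally
# shifted instances. Assumptions (not from the paper): GPT-2 as the
# pre-trained generative LM, a simple prefix-continuation strategy, and
# nucleus sampling; the actual ODIST construction may differ.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def shifted_instances(text, prefix_ratio=0.5, num_samples=3):
    """Keep a prefix of an in-domain sentence and let the LM continue it,
    so that the completions drift away from the in-domain distribution."""
    token_ids = tokenizer.encode(text)
    prefix = token_ids[: max(1, int(len(token_ids) * prefix_ratio))]
    input_ids = torch.tensor([prefix])
    with torch.no_grad():
        outputs = model.generate(
            input_ids,
            do_sample=True,              # sampling encourages distributional drift
            top_p=0.95,
            max_length=len(token_ids) + 20,
            num_return_sequences=num_samples,
            pad_token_id=tokenizer.eos_token_id,
        )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# The generated texts would then be labelled with an extra "open"
# (out-of-domain) class and mixed into the in-domain training data of a
# standard text classifier.
print(shifted_instances("i would like to book a flight from boston to denver"))
```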
Anthology ID:
2021.findings-emnlp.316
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3751–3756
URL:
https://aclanthology.org/2021.findings-emnlp.316
DOI:
10.18653/v1/2021.findings-emnlp.316
Cite (ACL):
Lei Shu, Yassine Benajiba, Saab Mansour, and Yi Zhang. 2021. ODIST: Open World Classification via Distributionally Shifted Instances. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3751–3756, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
ODIST: Open World Classification via Distributionally Shifted Instances (Shu et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.316.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.316.mp4
Data
MultiNLI