Single Example Can Improve Zero-Shot Data Generation

Pavel Burnyshev, Valentin Malykh, Andrey Bout, Ekaterina Artemova, Irina Piontkovskaya


Abstract
Sub-tasks of intent classification, such as robustness to distribution shift, adaptation to specific user groups and personalization, and out-of-domain detection, require extensive and flexible datasets for experiments and evaluation. As collecting such datasets is time-consuming and labor-intensive, we propose to use text generation methods to gather datasets. The generator should be trained to generate utterances that belong to the given intent. We explore two approaches to the generation of task-oriented utterances: in the zero-shot approach, the model is trained to generate utterances from seen intents and is further used to generate utterances for intents unseen during training. In the one-shot approach, the model is presented with a single utterance from a test intent. We perform a thorough automatic and human evaluation of the intrinsic properties of the two generation approaches. The attributes of the generated data are close to those of the original test sets, collected via crowd-sourcing.
Anthology ID:
2021.inlg-1.20
Volume:
Proceedings of the 14th International Conference on Natural Language Generation
Month:
August
Year:
2021
Address:
Aberdeen, Scotland, UK
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
201–211
URL:
https://aclanthology.org/2021.inlg-1.20
Cite (ACL):
Pavel Burnyshev, Valentin Malykh, Andrey Bout, Ekaterina Artemova, and Irina Piontkovskaya. 2021. Single Example Can Improve Zero-Shot Data Generation. In Proceedings of the 14th International Conference on Natural Language Generation, pages 201–211, Aberdeen, Scotland, UK. Association for Computational Linguistics.
Cite (Informal):
Single Example Can Improve Zero-Shot Data Generation (Burnyshev et al., INLG 2021)
PDF:
https://aclanthology.org/2021.inlg-1.20.pdf
Data
SGD