Effective Data Augmentation for Sentence Classification Using One VAE per Class

Frédéric Piedboeuf, Philippe Langlais


Abstract
In recent years, data augmentation has become an important field of machine learning. While images can be augmented with simple techniques such as cropping or rotation, textual data augmentation requires more complex manipulations to ensure that the generated examples are useful. Variational auto-encoders (VAEs) and their conditional variant, the conditional VAE (CVAE), are often used to generate new textual data; both rely on the generator being trained well enough that it does not produce examples of the wrong class. In this paper, we explore a simpler way to use VAEs for data augmentation: training one VAE per class. We show, across several dataset sizes and four binary classification tasks, that this approach systematically outperforms other generative data augmentation techniques.
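To illustrate the core idea, here is a minimal sketch of per-class VAE augmentation. It is not the authors' implementation: for simplicity it assumes sentences are already represented as fixed-size embedding vectors (the paper's VAEs decode to token sequences), and the network sizes, hyperparameters, and helper names (`train_vae`, `augment`) are hypothetical.

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    """Minimal VAE over fixed-size sentence vectors (a stand-in for the
    paper's text VAE, which generates token sequences)."""
    def __init__(self, dim=256, z_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, z_dim)
        self.logvar = nn.Linear(128, z_dim)
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, 128), nn.ReLU(), nn.Linear(128, dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: sample z while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

def train_vae(x, epochs=50):
    """Fit a VAE on the examples of a single class (x: [n, dim])."""
    vae = VAE(dim=x.size(1))
    opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
    for _ in range(epochs):
        recon, mu, logvar = vae(x)
        # ELBO = reconstruction loss + KL divergence to the unit Gaussian prior.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        loss = nn.functional.mse_loss(recon, x) + kl
        opt.zero_grad()
        loss.backward()
        opt.step()
    return vae

def augment(data_by_class, n_new=100, z_dim=32):
    """One VAE per class: train on that class's examples only, then sample
    from the prior and decode. Every generated example inherits the label
    of its VAE, so no conditioning is needed to keep labels correct."""
    augmented = {}
    for label, x in data_by_class.items():
        vae = train_vae(x)
        with torch.no_grad():
            z = torch.randn(n_new, z_dim)
            augmented[label] = vae.decoder(z)
    return augmented
```

The design point this sketch captures is the contrast with a CVAE: instead of one conditional generator that must learn not to mix classes, each class gets its own unconditional generator, so generated examples are class-pure by construction.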
Anthology ID:
2022.coling-1.305
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3454–3464
URL:
https://aclanthology.org/2022.coling-1.305
Cite (ACL):
Frédéric Piedboeuf and Philippe Langlais. 2022. Effective Data Augmentation for Sentence Classification Using One VAE per Class. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3454–3464, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Effective Data Augmentation for Sentence Classification Using One VAE per Class (Piedboeuf & Langlais, COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.305.pdf
Data
SST, SST-2