Counter-Contrastive Learning for Language GANs

Yekun Chai, Haidong Zhang, Qiyue Yin, Junge Zhang


Abstract
Generative Adversarial Networks (GANs) have achieved great success in image synthesis, but have proven difficult to apply to natural language generation. Challenges arise from the uninformative learning signals passed from the discriminator: these poor signals limit the generator’s capacity to produce language with rich structure and semantics. In this paper, we propose a counter-contrastive learning (CCL) method to support the generator’s training in language GANs. In contrast to standard GANs, which use a simple binary classifier to discriminate whether a sample is real or fake, we employ a counter-contrastive learning signal that advances the training of the language synthesizer by (1) pulling the representations of generated and real samples together and (2) pushing representations of real samples apart, thereby competing with the discriminator and preventing it from being overtrained. We evaluate our method on both synthetic and real benchmarks and obtain competitive performance compared to previous GANs for adversarial sequence generation.
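The two objectives in the abstract can be illustrated with a minimal sketch of an InfoNCE-style generator loss. This is a hypothetical reconstruction from the abstract alone, not the paper’s actual formulation: the pairing of samples, the temperature `tau`, and the cross-entropy form are all assumptions.

```python
import numpy as np

def counter_contrastive_loss(fake_repr, real_repr, tau=0.1):
    """Hypothetical counter-contrastive generator loss (sketch).

    (1) pulls each generated representation toward a paired real one,
    (2) pushes representations of real samples apart from each other,
    so the generator's signal runs counter to the discriminator's.
    """
    def normalize(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)

    fake = normalize(np.asarray(fake_repr, dtype=float))  # (B, d)
    real = normalize(np.asarray(real_repr, dtype=float))  # (B, d)
    B = real.shape[0]

    # (1) cosine similarity of each (generated, real) pair: the positives
    pos = np.sum(fake * real, axis=-1) / tau              # (B,)

    # (2) real-real similarities (off-diagonal) serve as the negatives
    real_sim = real @ real.T / tau                        # (B, B)
    neg = real_sim[~np.eye(B, dtype=bool)].reshape(B, B - 1)

    # InfoNCE-style cross-entropy with the positive in column 0:
    # minimized by raising fake-real similarity and lowering real-real
    logits = np.concatenate([pos[:, None], neg], axis=1)  # (B, B)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[:, 0].mean()
```

Under these assumptions, a generator minimizing this loss is rewarded both for matching real representations and for making the real samples look mutually dissimilar, which is what lets it "compete" with the discriminator as described above.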
Anthology ID:
2021.findings-emnlp.415
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4834–4839
URL:
https://aclanthology.org/2021.findings-emnlp.415
DOI:
10.18653/v1/2021.findings-emnlp.415
Cite (ACL):
Yekun Chai, Haidong Zhang, Qiyue Yin, and Junge Zhang. 2021. Counter-Contrastive Learning for Language GANs. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4834–4839, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Counter-Contrastive Learning for Language GANs (Chai et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.415.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.415.mp4