Joint Generator-Ranker Learning for Natural Language Generation

Weizhou Shen, Yeyun Gong, Yelong Shen, Song Wang, Xiaojun Quan, Nan Duan, Weizhu Chen


Abstract
Generate-then-rank is a widely used mechanism for text generation, where a generator produces multiple text candidates and a ranker chooses the best one among them. However, existing methods usually train the generator and the ranker individually, neglecting the mutual feedback that could further enhance the generation quality. To tackle this limitation, we propose JGR, a novel joint training algorithm that integrates the generator and the ranker in a single framework. JGR optimizes the generator with a hybrid objective that combines data likelihood and ranker reward, and trains the ranker with a contrastive loss that compares the generator outputs. By iteratively updating the generator and the ranker, JGR can effectively harmonize their learning and enhance their quality jointly. We evaluate JGR on various text generation tasks and demonstrate that it surpasses existing methods on four public datasets across three common generation scenarios. Our code and models are publicly available at https://github.com/microsoft/ProphetNet/tree/master/JGR.
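The abstract describes one JGR iteration: the generator proposes candidates, a task metric identifies the best one, the ranker is trained contrastively to prefer it, and the ranker's score in turn feeds back into the generator's objective. The following is a minimal, self-contained sketch of that control flow under toy stand-ins; every function name, the length-based "metric", and the single-weight "ranker" are illustrative assumptions, not the paper's actual implementation (which uses pretrained Transformer models and metrics such as ROUGE/BLEU).

```python
import math
import random

random.seed(0)

# --- Toy stand-ins (illustrative only, not from the paper) ---

def generate_candidates(src, k=4):
    """Toy generator: emit k candidates of varying length."""
    return [src + " out" * random.randint(1, 5) for _ in range(k)]

def task_metric(cand, ref):
    """Toy reward in [0, 1]: length similarity stands in for ROUGE/BLEU."""
    return 1.0 - abs(len(cand) - len(ref)) / max(len(cand), len(ref))

def ranker_scores(cands, w):
    """Toy ranker: a single weight w scoring candidate length."""
    return [w * len(c) for c in cands]

def contrastive_loss(scores, pos_idx):
    """Softmax contrastive loss: the metric-best candidate should rank highest."""
    log_z = math.log(sum(math.exp(s) for s in scores))
    return log_z - scores[pos_idx]

def jgr_iteration(src, ref, w):
    """One schematic generator/ranker round trip."""
    cands = generate_candidates(src)
    rewards = [task_metric(c, ref) for c in cands]
    scores = ranker_scores(cands, w)

    # Ranker update: contrastive loss over the generator's own outputs,
    # treating the highest-reward candidate as the positive example.
    pos = max(range(len(cands)), key=rewards.__getitem__)
    ranker_loss = contrastive_loss(scores, pos)

    # Generator update (schematic): JGR combines data likelihood with the
    # ranker's reward; here we only expose the reward term the ranker feeds back.
    generator_reward = scores[pos]
    return ranker_loss, generator_reward

ranker_loss, generator_reward = jgr_iteration("hello", "hello out out", w=0.1)
```

In the full algorithm these two updates alternate over training batches, so each model keeps adapting to the other's current outputs rather than to a frozen counterpart.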
Anthology ID:
2023.findings-acl.486
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7681–7699
URL:
https://aclanthology.org/2023.findings-acl.486
DOI:
10.18653/v1/2023.findings-acl.486
Cite (ACL):
Weizhou Shen, Yeyun Gong, Yelong Shen, Song Wang, Xiaojun Quan, Nan Duan, and Weizhu Chen. 2023. Joint Generator-Ranker Learning for Natural Language Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7681–7699, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Joint Generator-Ranker Learning for Natural Language Generation (Shen et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.486.pdf