MTG: A Benchmark Suite for Multilingual Text Generation

Yiran Chen, Zhenqiao Song, Xianze Wu, Danqing Wang, Jingjing Xu, Jiaze Chen, Hao Zhou, Lei Li


Abstract
We introduce MTG, a new benchmark suite for training and evaluating multilingual text generation. It is the first multilingual, multiway text generation dataset, and the largest with human annotations (400k instances). It covers four generation tasks (story generation, question generation, title generation, and text summarization) across five languages (English, German, French, Spanish, and Chinese). The multiway setup makes it possible to test a model's knowledge-transfer capabilities across languages and tasks. Using MTG, we train and analyze several popular multilingual generation models from different aspects. Our benchmark suite promotes stronger model performance through additional human-annotated parallel data and provides comprehensive evaluation across diverse generation scenarios. Code and data are available at https://github.com/zide05/MTG.
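As a rough illustration of the multiway setup described in the abstract, a single annotated instance can carry parallel inputs and outputs for the same task in all five languages, so any source language can be paired with any target language. The sketch below is hypothetical: the field names, record layout, and loading logic are illustrative assumptions, not the actual schema of the MTG release in zide05/MTG.

```python
# Hypothetical sketch of a multiway-parallel MTG-style record; the actual
# data format in https://github.com/zide05/MTG may differ.
from itertools import permutations

LANGUAGES = ["en", "de", "fr", "es", "zh"]

# One title-generation instance with parallel article/title pairs per language
# (contents shortened; the real dataset has 400k human-annotated instances).
record = {
    "task": "title_generation",
    "article": {lang: f"<article text in {lang}>" for lang in LANGUAGES},
    "title": {lang: f"<title in {lang}>" for lang in LANGUAGES},
}

# The multiway property: every (source language, target language) pair is a
# valid training/evaluation direction, which is what allows probing
# cross-lingual knowledge transfer.
cross_lingual_pairs = [
    (record["article"][src], record["title"][tgt])
    for src, tgt in permutations(LANGUAGES, 2)
]
print(len(cross_lingual_pairs))  # 20 directed language pairs per instance
```

Under this illustrative layout, the same enumeration applies to the other three tasks, so a single multiway instance yields supervision for both in-language and cross-lingual generation directions.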
Anthology ID:
2022.findings-naacl.192
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2508–2527
URL:
https://aclanthology.org/2022.findings-naacl.192
DOI:
10.18653/v1/2022.findings-naacl.192
Cite (ACL):
Yiran Chen, Zhenqiao Song, Xianze Wu, Danqing Wang, Jingjing Xu, Jiaze Chen, Hao Zhou, and Lei Li. 2022. MTG: A Benchmark Suite for Multilingual Text Generation. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2508–2527, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
MTG: A Benchmark Suite for Multilingual Text Generation (Chen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.192.pdf
Video:
https://aclanthology.org/2022.findings-naacl.192.mp4
Code
zide05/MTG
Data
CNN/Daily Mail, GEM, ROCStories, SQuAD