Natural Response Generation for Chinese Reading Comprehension

Nuo Chen, Hongguang Li, Yinan Bao, Baoyuan Wang, Jia Li


Abstract
Machine reading comprehension (MRC) is an important component of conversational agents and has drawn considerable attention. However, current MRC benchmarks share a notable limitation: the labeled answers are mostly either spans extracted from the target corpus or choices from a given candidate set, ignoring the naturalness of high-quality responses. As a result, MRC models trained on these datasets cannot generate human-like responses in real QA scenarios. To this end, we construct a new dataset called Penguin to advance MRC research, providing a training and test bed for natural response generation in real scenarios. Concretely, Penguin consists of 200k training examples with high-quality, fluent, and well-informed responses. Penguin is the first relatively large-scale benchmark for natural response generation in Chinese MRC. To address the challenges in Penguin, we develop two strong baselines: an end-to-end framework and a two-stage framework. We then further design Prompt-BART, which fine-tunes pre-trained generative language models with a mixture of prefix prompts on Penguin. Extensive experiments validate the effectiveness of this design.
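
The abstract describes Prompt-BART only at a high level. As a rough illustration, the sketch below shows what fine-tuning a generative language model with a mixture of prefix prompts could look like; the checkpoint name (fnlp/bart-base-chinese), the prompt strings, and the example fields are assumptions made for illustration, not the authors' released code.

```python
# Minimal sketch of prefix-prompt fine-tuning, assuming a Hugging Face
# Chinese BART checkpoint; this is NOT the authors' Prompt-BART release.
import random
import torch
from transformers import AutoTokenizer, BartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("fnlp/bart-base-chinese")
model = BartForConditionalGeneration.from_pretrained("fnlp/bart-base-chinese")

# A mixture of prefix prompts: one is sampled per training example so the
# model sees varied instructions for the same response-generation task.
PREFIX_PROMPTS = [
    "请根据短文自然地回答问题：",      # "Answer the question naturally from the passage:"
    "阅读下文并给出流畅完整的回复：",  # "Read the passage and reply fluently and fully:"
]

def encode(question: str, passage: str, response: str):
    # Prepend a randomly chosen prefix prompt to the question, pair it
    # with the passage, and use the natural response as the target.
    prefix = random.choice(PREFIX_PROMPTS)
    enc = tokenizer(prefix + question, passage, truncation=True,
                    max_length=512, return_tensors="pt")
    labels = tokenizer(response, truncation=True, max_length=128,
                       return_tensors="pt").input_ids
    return enc.input_ids, enc.attention_mask, labels

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
input_ids, attention_mask, labels = encode(
    "企鹅主要生活在哪里？",                          # hypothetical question
    "企鹅是不会飞的海鸟，绝大多数分布在南半球……",    # hypothetical passage
    "企鹅主要生活在南半球，例如南极洲沿岸。",        # target natural response
)
loss = model(input_ids=input_ids, attention_mask=attention_mask,
             labels=labels).loss   # standard seq2seq cross-entropy
loss.backward()
optimizer.step()
```

In this reading, sampling one prompt per example exposes the model to several phrasings of the same instruction, which is one plausible way to realize a "mixture of prefix prompts" during fine-tuning.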
Anthology ID: 2023.findings-emnlp.739
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 11068–11081
URL: https://aclanthology.org/2023.findings-emnlp.739
DOI: 10.18653/v1/2023.findings-emnlp.739
Cite (ACL): Nuo Chen, Hongguang Li, Yinan Bao, Baoyuan Wang, and Jia Li. 2023. Natural Response Generation for Chinese Reading Comprehension. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11068–11081, Singapore. Association for Computational Linguistics.
Cite (Informal): Natural Response Generation for Chinese Reading Comprehension (Chen et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.739.pdf