Coupling Context Modeling with Zero Pronoun Recovering for Document-Level Natural Language Generation

Xin Tan, Longyin Zhang, Guodong Zhou


Abstract
Natural language generation (NLG) tasks on pro-drop languages are known to suffer from zero pronoun (ZP) problems, which remain challenging due to the scarcity of ZP-annotated NLG corpora. To address this, we propose a highly adaptive two-stage approach that couples context modeling with ZP recovering to mitigate the ZP problem in NLG tasks. Notably, we frame the recovery process in a task-supervised fashion, where the ability to recover ZP representations is learned during NLG task training, so our method does not require NLG corpora annotated with ZPs. For system enhancement, we learn an adversarial bot that adjusts our model outputs to alleviate the error propagation caused by mis-recovered ZPs. Experiments on three document-level NLG tasks, i.e., machine translation, question answering, and summarization, show that our approach substantially improves performance, with particularly strong gains on pronoun translation.
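The abstract describes the coupling at a high level only. As a rough illustration, a minimal sketch of the idea, assuming a Transformer-style encoder and using purely hypothetical module and parameter names (this is not the authors' released implementation; see the txannie/zp-dnlg repository for that), could attend over document context to recover latent ZP representations and gate them into the sentence encodings before decoding, so the recovery signal is learned end-to-end from the downstream NLG objective rather than from ZP annotations:

```python
# Hypothetical sketch only; names (ZPRecoveryLayer, context_attn, gate) are assumptions.
import torch
import torch.nn as nn


class ZPRecoveryLayer(nn.Module):
    """Recovers latent ZP representations from document context, task-supervised."""

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        # Cross-attention from current-sentence tokens to surrounding context.
        self.context_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Gate deciding how much recovered ZP information to inject per token.
        self.gate = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())
        self.norm = nn.LayerNorm(d_model)

    def forward(self, src_hidden, ctx_hidden):
        # src_hidden: (B, S, d) encodings of the current sentence
        # ctx_hidden: (B, C, d) encodings of the document context
        zp_repr, _ = self.context_attn(src_hidden, ctx_hidden, ctx_hidden)
        g = self.gate(torch.cat([src_hidden, zp_repr], dim=-1))
        # Fuse: positions near dropped pronouns can pull in recovered information;
        # gradients flow from the NLG loss, so no explicit ZP labels are needed.
        return self.norm(src_hidden + g * zp_repr)


if __name__ == "__main__":
    layer = ZPRecoveryLayer(d_model=512)
    src = torch.randn(2, 20, 512)   # current-sentence encodings
    ctx = torch.randn(2, 60, 512)   # document-context encodings
    print(layer(src, ctx).shape)    # torch.Size([2, 20, 512])
```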
Anthology ID:
2021.emnlp-main.197
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2530–2540
URL:
https://aclanthology.org/2021.emnlp-main.197
DOI:
10.18653/v1/2021.emnlp-main.197
Cite (ACL):
Xin Tan, Longyin Zhang, and Guodong Zhou. 2021. Coupling Context Modeling with Zero Pronoun Recovering for Document-Level Natural Language Generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2530–2540, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Coupling Context Modeling with Zero Pronoun Recovering for Document-Level Natural Language Generation (Tan et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.197.pdf
Video:
https://aclanthology.org/2021.emnlp-main.197.mp4
Code:
txannie/zp-dnlg
Data:
MATINF