Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution

Ryuto Konno, Shun Kiyono, Yuichiroh Matsubayashi, Hiroki Ouchi, Kentaro Inui


Abstract
Masked language models (MLMs) have contributed to drastic performance improvements in zero anaphora resolution (ZAR). To further improve this approach, we make two proposals in this study. The first is a new pretraining task that trains MLMs on anaphoric relations with explicit supervision, and the second is a new finetuning method that remedies a notorious issue, the pretrain-finetune discrepancy. Our experiments on Japanese ZAR demonstrate that the two proposals boost state-of-the-art performance, and our detailed analysis provides new insights into the remaining challenges.
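The abstract does not spell out how the pseudo zero pronoun resolution instances are constructed; the sketch below is only one plausible, simplified reading of "training MLMs on anaphoric relations with explicit supervision", not the authors' released implementation (see the linked repository for that). The function name `build_pseudo_instance`, the noun-set heuristic, and the use of a `[MASK]` placeholder are all assumptions made for illustration.

```python
# Illustrative sketch only (assumed construction, not the authors' code):
# take raw text, mask a later occurrence of a repeated noun so it behaves
# like a zero pronoun, and keep the earlier occurrence as the antecedent
# the model should point to.
from dataclasses import dataclass
from typing import List, Optional, Set


@dataclass
class PseudoInstance:
    tokens: List[str]        # token sequence with the later mention masked
    masked_index: int        # position of the [MASK] (the pseudo zero pronoun)
    antecedent_index: int    # position of the earlier mention (gold antecedent)


def build_pseudo_instance(tokens: List[str],
                          candidate_nouns: Set[str]) -> Optional[PseudoInstance]:
    """Mask the second occurrence of a repeated candidate noun and record the
    first occurrence as its antecedent. Returns None if nothing is repeated."""
    first_seen = {}
    for i, tok in enumerate(tokens):
        if tok not in candidate_nouns:
            continue
        if tok in first_seen:
            masked = list(tokens)
            masked[i] = "[MASK]"  # the later mention becomes a pseudo zero pronoun
            return PseudoInstance(masked, i, first_seen[tok])
        first_seen[tok] = i
    return None


if __name__ == "__main__":
    sent = "Ken bought a book and Ken read it on the train".split()
    inst = build_pseudo_instance(sent, candidate_nouns={"Ken", "book", "train"})
    if inst is not None:
        print(inst.tokens)            # second 'Ken' replaced by '[MASK]'
        print(inst.masked_index)      # index of the masked mention
        print(inst.antecedent_index)  # index of the earlier 'Ken'
```

Under this assumed setup, an MLM would be finetuned to select the antecedent position for the masked slot rather than to predict a vocabulary item, which is one way an anaphoric signal could be supervised explicitly.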
Anthology ID:
2021.emnlp-main.308
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3790–3806
URL:
https://aclanthology.org/2021.emnlp-main.308
DOI:
10.18653/v1/2021.emnlp-main.308
Cite (ACL):
Ryuto Konno, Shun Kiyono, Yuichiroh Matsubayashi, Hiroki Ouchi, and Kentaro Inui. 2021. Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3790–3806, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution (Konno et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.308.pdf
Video:
https://aclanthology.org/2021.emnlp-main.308.mp4
Code:
ryuto10/pzero-improves-zar