Japanese Zero Anaphora Resolution Can Benefit from Parallel Texts Through Neural Transfer Learning

Masato Umakoshi, Yugo Murawaki, Sadao Kurohashi


Abstract
Parallel texts of Japanese and a non-pro-drop language have the potential to improve the performance of Japanese zero anaphora resolution (ZAR) because pronouns dropped in the former are usually mentioned explicitly in the latter. However, rule-based cross-lingual transfer is hampered by error propagation in an NLP pipeline and by the frequent lack of transparency in translation correspondences. In this paper, we propose implicit transfer by injecting machine translation (MT) as an intermediate task between pretraining and ZAR. We employ a pretrained BERT model to initialize the encoder part of an encoder-decoder model for MT, and then eject the encoder for fine-tuning on ZAR. The proposed framework empirically demonstrates that ZAR performance can be improved by transfer learning from MT. In addition, we find that incorporating masked language model training into MT leads to further gains.
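
The sketch below illustrates the transfer scheme described in the abstract (BERT-initialized encoder of an MT encoder-decoder, later ejected for ZAR fine-tuning) using the Hugging Face transformers EncoderDecoderModel API. The checkpoints, decoder choice, and ZAR head are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, assuming Hugging Face transformers; not the authors' code.
from transformers import BertModel, EncoderDecoderModel

# Step 1: initialize the encoder of an encoder-decoder MT model from a
# pretrained Japanese BERT. The decoder checkpoint here is a stand-in for the
# target-language side; the paper's decoder configuration may differ.
mt_model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "cl-tohoku/bert-base-japanese",  # encoder: pretrained Japanese BERT
    "bert-base-uncased",             # decoder: illustrative placeholder
)

# Step 2: fine-tune mt_model on Japanese-to-English parallel text
# (the intermediate MT task; training loop omitted).

# Step 3: eject the MT-trained encoder and reuse it for ZAR fine-tuning.
zar_encoder: BertModel = mt_model.encoder
# zar_head = ZARHead(zar_encoder.config.hidden_size)  # hypothetical task head
# ...fine-tune zar_encoder together with zar_head on the ZAR dataset...
```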
Anthology ID:
2021.findings-emnlp.165
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1920–1934
URL:
https://aclanthology.org/2021.findings-emnlp.165
DOI:
10.18653/v1/2021.findings-emnlp.165
Cite (ACL):
Masato Umakoshi, Yugo Murawaki, and Sadao Kurohashi. 2021. Japanese Zero Anaphora Resolution Can Benefit from Parallel Texts Through Neural Transfer Learning. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1920–1934, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Japanese Zero Anaphora Resolution Can Benefit from Parallel Texts Through Neural Transfer Learning (Umakoshi et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.165.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.165.mp4