Semantic-aware Chinese Zero Pronoun Resolution with Pre-trained Semantic Dependency Parser

Lanqiu Zhang, Zizhuo Shen, Yanqiu Shao


Abstract
Deep learning-based Chinese zero pronoun resolution models have achieved better performance than traditional machine learning-based models. However, existing work on Chinese zero pronoun resolution has not yet integrated linguistic information well into deep learning-based models. This paper adopts a pre-training-based approach and integrates the semantic representations produced by a pre-trained Chinese semantic dependency graph parser into a Chinese zero pronoun resolution model. Experimental results on the OntoNotes-5.0 dataset show that our proposed Chinese zero pronoun resolution model with the pre-trained Chinese semantic dependency parser improves the F-score by 0.4% over our baseline model and obtains better results than other deep learning-based Chinese zero pronoun resolution models. In addition, integrating BERT representations into our model further improves performance by 0.7% over our baseline model.
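The abstract does not specify the architecture, but its core idea, fusing representations from a pre-trained semantic dependency parser with the resolver's own token features, can be illustrated with a minimal sketch. Everything below (class and parameter names, dimensions, BiLSTM encoder, and fusion by concatenation) is a hypothetical illustration under assumed design choices, not the authors' actual model:

```python
import torch
import torch.nn as nn

class SemanticAwareZPResolver(nn.Module):
    """Sketch: fuse frozen parser-encoder states with token embeddings,
    then score candidate antecedents against a zero-pronoun position."""

    def __init__(self, vocab_size=10000, emb_dim=128, sem_dim=256, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM over the concatenation of word embeddings and the
        # per-token states exported by the (frozen) semantic parser.
        self.encoder = nn.LSTM(emb_dim + sem_dim, hidden,
                               batch_first=True, bidirectional=True)
        # Feed-forward scorer over [zero-pronoun ; candidate] pairs.
        self.scorer = nn.Sequential(
            nn.Linear(4 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, token_ids, sem_states, zp_idx, cand_idx):
        # token_ids: (B, T) ints; sem_states: (B, T, sem_dim) from the parser;
        # zp_idx, cand_idx: (B,) positions of the gap and candidate head.
        x = torch.cat([self.embed(token_ids), sem_states], dim=-1)
        h, _ = self.encoder(x)                       # (B, T, 2*hidden)
        batch = torch.arange(h.size(0))
        zp = h[batch, zp_idx]                        # zero-pronoun states
        cand = h[batch, cand_idx]                    # candidate states
        return self.scorer(torch.cat([zp, cand], dim=-1)).squeeze(-1)
```

Concatenation is only one plausible fusion strategy; the same interface would accommodate attention-based fusion or, per the abstract's BERT variant, substituting BERT hidden states for the word embeddings. Freezing the parser encoder mirrors the pre-training idea: the semantic representations are learned on the parsing task and reused, not retrained, by the resolver.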
Anthology ID:
2020.ccl-1.77
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Editors:
Maosong Sun (孙茂松), Sujian Li (李素建), Yue Zhang (张岳), Yang Liu (刘洋)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
831–841
Language:
English
URL:
https://aclanthology.org/2020.ccl-1.77
Cite (ACL):
Lanqiu Zhang, Zizhuo Shen, and Yanqiu Shao. 2020. Semantic-aware Chinese Zero Pronoun Resolution with Pre-trained Semantic Dependency Parser. In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 831–841, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
Semantic-aware Chinese Zero Pronoun Resolution with Pre-trained Semantic Dependency Parser (Zhang et al., CCL 2020)
PDF:
https://aclanthology.org/2020.ccl-1.77.pdf