%0 Conference Proceedings
%T APGN: Adversarial and Parameter Generation Networks for Multi-Source Cross-Domain Dependency Parsing
%A Li, Ying
%A Zhang, Meishan
%A Li, Zhenghua
%A Zhang, Min
%A Wang, Zhefeng
%A Huai, Baoxing
%A Yuan, Nicholas Jing
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Findings of the Association for Computational Linguistics: EMNLP 2021
%D 2021
%8 November
%I Association for Computational Linguistics
%C Punta Cana, Dominican Republic
%F li-etal-2021-apgn-adversarial
%X Thanks to the strong representation learning capability of deep learning, especially pre-training techniques with language model loss, dependency parsing has achieved a great performance boost in the in-domain scenario with abundant labeled training data for target domains. However, the parsing community must face the more realistic setting, where parsing performance drops drastically when labeled data exists only for several fixed out-domains. In this work, we propose a novel model for multi-source cross-domain dependency parsing. The model consists of two components, i.e., a parameter generation network for distinguishing domain-specific features, and an adversarial network for learning domain-invariant representations. Experiments on the recently released NLPCC-2019 dataset for multi-domain dependency parsing show that our model consistently improves cross-domain parsing performance by about 2 points in averaged labeled attachment score (LAS) over strong BERT-enhanced baselines. Detailed analysis is conducted to gain more insight into the contributions of the two components.
%R 10.18653/v1/2021.findings-emnlp.149
%U https://aclanthology.org/2021.findings-emnlp.149
%U https://doi.org/10.18653/v1/2021.findings-emnlp.149
%P 1724-1733