%0 Conference Proceedings %T Adaptation of Multilingual Transformer Encoder for Robust Enhanced Universal Dependency Parsing %A He, Han %A Choi, Jinho D. %Y Bouma, Gosse %Y Matsumoto, Yuji %Y Oepen, Stephan %Y Sagae, Kenji %Y Seddah, Djamé %Y Sun, Weiwei %Y Søgaard, Anders %Y Tsarfaty, Reut %Y Zeman, Dan %S Proceedings of the 16th International Conference on Parsing Technologies and the IWPT 2020 Shared Task on Parsing into Enhanced Universal Dependencies %D 2020 %8 July %I Association for Computational Linguistics %C Online %F he-choi-2020-adaptation %X This paper presents our enhanced dependency parsing approach using transformer encoders, coupled with a simple yet powerful ensemble algorithm that takes advantage of both tree and graph dependency parsing. Two types of transformer encoders are compared: a multilingual encoder and language-specific encoders. Our dependency tree parsing (DTP) approach generates only primary dependencies to form trees, whereas our dependency graph parsing (DGP) approach handles both primary and secondary dependencies to form graphs. Since DGP does not guarantee that the generated graphs are acyclic, the ensemble algorithm is designed to add secondary arcs predicted by DGP to primary arcs predicted by DTP. Our results show that models using the multilingual encoder outperform ones using the language-specific encoders for most languages. The ensemble models generally show higher labeled attachment scores on enhanced dependencies (ELAS) than the DTP and DGP models. As a result, our best models rank third in macro-average ELAS over 17 languages. %R 10.18653/v1/2020.iwpt-1.19 %U https://aclanthology.org/2020.iwpt-1.19 %U https://doi.org/10.18653/v1/2020.iwpt-1.19 %P 181-191