%0 Conference Proceedings
%T Universal Dependencies According to BERT: Both More Specific and More General
%A Limisiewicz, Tomasz
%A Mareček, David
%A Rosa, Rudolf
%Y Cohn, Trevor
%Y He, Yulan
%Y Liu, Yang
%S Findings of the Association for Computational Linguistics: EMNLP 2020
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F limisiewicz-etal-2020-universal
%X This work focuses on analyzing the form and extent of syntactic abstraction captured by BERT by extracting labeled dependency trees from self-attentions. Previous work showed that individual BERT heads tend to encode particular dependency relation types. We extend these findings by explicitly comparing BERT relations to Universal Dependencies (UD) annotations, showing that they often do not match one-to-one. We suggest a method for relation identification and syntactic tree construction. Our approach produces significantly more consistent dependency trees than previous work, showing that it better explains the syntactic abstractions in BERT. At the same time, it can be successfully applied with only a minimal amount of supervision and generalizes well across languages.
%R 10.18653/v1/2020.findings-emnlp.245
%U https://aclanthology.org/2020.findings-emnlp.245
%U https://doi.org/10.18653/v1/2020.findings-emnlp.245
%P 2710-2722