%0 Conference Proceedings
%T Dependency Parsing as Head Selection
%A Zhang, Xingxing
%A Cheng, Jianpeng
%A Lapata, Mirella
%Y Lapata, Mirella
%Y Blunsom, Phil
%Y Koller, Alexander
%S Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
%D 2017
%8 April
%I Association for Computational Linguistics
%C Valencia, Spain
%F zhang-etal-2017-dependency
%X Conventional graph-based dependency parsers guarantee a tree structure both during training and inference. Instead, we formalize dependency parsing as the problem of independently selecting the head of each word in a sentence. Our model, which we call DeNSe (as shorthand for Dependency Neural Selection), produces a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network. Without enforcing structural constraints during training, DeNSe generates (at inference time) trees for the overwhelming majority of sentences, while non-tree outputs can be adjusted with a maximum spanning tree algorithm. We evaluate DeNSe on four languages (English, Chinese, Czech, and German) with varying degrees of non-projectivity. Despite the simplicity of the approach, our parsers are on par with the state of the art.
%U https://aclanthology.org/E17-1063
%P 665-676
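
Note: the sketch below is only an illustration of the head-selection idea described in the abstract (score every candidate head for each word from bidirectional RNN features, pick heads independently, and fall back to a maximum spanning tree when the result is not a tree). It is not the authors' DeNSe implementation; the `HeadSelector` class, the MLP scorer, and all dimensions are hypothetical choices, assuming PyTorch.

```python
# Minimal head-selection sketch (illustrative only, not the DeNSe code).
import torch
import torch.nn as nn

class HeadSelector(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Simple pairwise scorer over concatenated (dependent, head) states;
        # the paper's exact scoring function may differ.
        self.scorer = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, 1))

    def forward(self, word_ids):
        # word_ids: (batch, seq_len); position 0 is assumed to be a ROOT token.
        states, _ = self.bilstm(self.embed(word_ids))      # (B, N, 2H)
        B, N, H2 = states.shape
        dep = states.unsqueeze(2).expand(B, N, N, H2)      # dependent states
        head = states.unsqueeze(1).expand(B, N, N, H2)     # candidate heads
        scores = self.scorer(torch.cat([dep, head], dim=-1)).squeeze(-1)
        # scores[b, i, j] = score of word j being the head of word i.
        return scores

if __name__ == "__main__":
    model = HeadSelector(vocab_size=1000)
    sent = torch.randint(0, 1000, (1, 6))   # toy sentence, ROOT at index 0
    scores = model(sent)
    # Forbid a word from selecting itself as head.
    N = scores.size(1)
    scores = scores.masked_fill(torch.eye(N, dtype=torch.bool), float("-inf"))
    # Independent head selection: argmax over candidate heads for each word.
    heads = scores.argmax(dim=-1)
    print(heads)
    # If the predicted head assignment is not a tree, it can be repaired with
    # a maximum spanning tree algorithm (e.g. Chu-Liu/Edmonds), as the abstract notes.
```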