%0 Journal Article
%T Exploring Compositional Architectures and Word Vector Representations for Prepositional Phrase Attachment
%A Belinkov, Yonatan
%A Lei, Tao
%A Barzilay, Regina
%A Globerson, Amir
%J Transactions of the Association for Computational Linguistics
%D 2014
%V 2
%I MIT Press
%C Cambridge, MA
%F belinkov-etal-2014-exploring
%X Prepositional phrase (PP) attachment disambiguation is a known challenge in syntactic parsing. The lexical sparsity associated with PP attachments motivates research in word representations that can capture pertinent syntactic and semantic features of the word. One promising solution is to use word vectors induced from large amounts of raw text. However, state-of-the-art systems that employ such representations yield modest gains in PP attachment accuracy. In this paper, we show that word vector representations can yield significant PP attachment performance gains. This is achieved via a non-linear architecture that is discriminatively trained to maximize PP attachment accuracy. The architecture is initialized with word vectors trained from unlabeled data, and relearns them to maximize attachment accuracy. We obtain additional performance gains with alternative representations such as dependency-based word vectors. When tested on both English and Arabic datasets, our method outperforms both a strong SVM classifier and state-of-the-art parsers. For instance, we achieve 82.6% PP attachment accuracy on Arabic, while the Turbo and Charniak self-trained parsers obtain 76.7% and 80.8%, respectively.
%R 10.1162/tacl_a_00203
%U https://aclanthology.org/Q14-1043
%U https://doi.org/10.1162/tacl_a_00203
%P 561-572