%0 Conference Proceedings %T A Distribution-based Model to Learn Bilingual Word Embeddings %A Cao, Hailong %A Zhao, Tiejun %A Zhang, Shu %A Meng, Yao %Y Matsumoto, Yuji %Y Prasad, Rashmi %S Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers %D 2016 %8 December %I The COLING 2016 Organizing Committee %C Osaka, Japan %F cao-etal-2016-distribution %X We introduce a distribution-based model to learn bilingual word embeddings from monolingual data. It is simple and effective, and requires no parallel data or seed lexicon. We take advantage of the fact that word embeddings are usually dense, real-valued, low-dimensional vectors, so their distribution can be accurately estimated. A novel cross-lingual learning objective is proposed which directly matches the distribution of word embeddings in one language with that in the other. During the joint learning process, we dynamically estimate the distributions of word embeddings in the two languages and minimize the dissimilarity between them through the standard backpropagation algorithm. Our learned bilingual word embeddings group each word and its translations together in the shared vector space. We demonstrate the utility of the learned embeddings on the task of finding word-to-word translations from monolingual corpora. Our model achieves encouraging performance on data in both related languages and substantially different languages. %U https://aclanthology.org/C16-1171 %P 1818-1827