%0 Conference Proceedings
%T Tiny Word Embeddings Using Globally Informed Reconstruction
%A Ohashi, Sora
%A Isogawa, Mao
%A Kajiwara, Tomoyuki
%A Arase, Yuki
%Y Scott, Donia
%Y Bel, Nuria
%Y Zong, Chengqing
%S Proceedings of the 28th International Conference on Computational Linguistics
%D 2020
%8 December
%I International Committee on Computational Linguistics
%C Barcelona, Spain (Online)
%F ohashi-etal-2020-tiny
%X We reduce the model size of pre-trained word embeddings by a factor of 200 while preserving their quality. Previous studies in this direction created smaller word embedding models by reconstructing pre-trained word representations from those of subwords, which allows storing only a small number of subword embeddings in memory. However, previous studies that train the reconstruction models using only the target words cannot reduce the model size drastically while preserving quality. Inspired by the observation that words with similar meanings have similar embeddings, our reconstruction training learns the global relationships among words, and it can be employed in various models for word embedding reconstruction. Experimental results on word similarity benchmarks show that the proposed method improves the performance of all the subword-based reconstruction models.
%R 10.18653/v1/2020.coling-main.103
%U https://aclanthology.org/2020.coling-main.103
%U https://doi.org/10.18653/v1/2020.coling-main.103
%P 1199-1203
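The abstract above describes reconstructing pre-trained word embeddings from subword embeddings with a loss that also respects global relationships among words. The following is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' implementation: the module names (SubwordReconstructor), the mean-pooling architecture, and the specific "global" similarity-matching term are all assumptions made for illustration.

    # Hypothetical sketch of subword-based reconstruction with a globally
    # informed loss term; the architecture and loss are assumptions, not the
    # method described in the paper.
    import torch
    import torch.nn as nn

    class SubwordReconstructor(nn.Module):
        def __init__(self, num_subwords, dim):
            super().__init__()
            # Only this small subword table is stored instead of the full vocabulary.
            self.subword_emb = nn.Embedding(num_subwords, dim)
            # Maps pooled subword vectors into the pre-trained word-vector space.
            self.proj = nn.Linear(dim, dim)

        def forward(self, subword_ids):
            # subword_ids: (batch, max_subwords) -> average subword vectors, then project.
            pooled = self.subword_emb(subword_ids).mean(dim=1)
            return self.proj(pooled)

    def loss_fn(recon, target, reference):
        # Local term: match the pre-trained target word vector directly.
        local = ((recon - target) ** 2).mean()
        # Global term (assumption): also match similarities to a sample of other
        # pre-trained word vectors, so globally similar words stay similar.
        global_term = ((recon @ reference.T - target @ reference.T) ** 2).mean()
        return local + global_term

    # Toy usage with random tensors standing in for real pre-trained embeddings.
    dim, num_subwords = 300, 10_000
    model = SubwordReconstructor(num_subwords, dim)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    subword_ids = torch.randint(0, num_subwords, (32, 4))  # 32 words, 4 subwords each
    target = torch.randn(32, dim)                          # pre-trained word vectors
    reference = torch.randn(256, dim)                      # sampled "global" reference words
    for _ in range(100):
        opt.zero_grad()
        loss = loss_fn(model(subword_ids), target, reference)
        loss.backward()
        opt.step()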