%0 Conference Proceedings
%T Investigating Effective Parameters for Fine-tuning of Word Embeddings Using Only a Small Corpus
%A Komiya, Kanako
%A Shinnou, Hiroyuki
%Y Haffari, Reza
%Y Cherry, Colin
%Y Foster, George
%Y Khadivi, Shahram
%Y Salehi, Bahar
%S Proceedings of the Workshop on Deep Learning Approaches for Low-Resource NLP
%D 2018
%8 July
%I Association for Computational Linguistics
%C Melbourne
%F komiya-shinnou-2018-investigating
%X Fine-tuning is a popular method for achieving better performance when only a small target corpus is available. However, it requires tuning a number of metaparameters and thus may have adverse effects when inappropriate metaparameters are used. We therefore investigate effective parameters for fine-tuning when only a small target corpus is available. In this study, we aim to improve Japanese word embeddings created from a huge corpus. First, we demonstrate that even word embeddings created from a huge corpus are affected by domain shift. We then investigate effective parameters for fine-tuning these word embeddings using a small target corpus. We use the perplexity of a language model obtained from a Long Short-Term Memory network to assess the word embeddings input into the network. The experiments reveal that fine-tuning sometimes has adverse effects when only a small target corpus is used, and that batch size is the most important parameter for fine-tuning. In addition, we confirm that the effect of fine-tuning is greater when the target corpus is larger.
%R 10.18653/v1/W18-3408
%U https://aclanthology.org/W18-3408
%U https://doi.org/10.18653/v1/W18-3408
%P 60-67