Predicting Antonyms in Context using BERT
Ayana Niwa | Keisuke Nishiguchi | Naoaki Okazaki
Proceedings of the 14th International Conference on Natural Language Generation (INLG 2021)
We address the task of antonym prediction in context, framed as a fill-in-the-blank problem. This task setting is unique and practical because the word filling the blank must both contrast with the given word and read naturally in the surrounding text. We propose methods for fine-tuning pre-trained masked language models (BERT) for context-aware antonym prediction. The experimental results demonstrate that these methods improve the prediction of antonyms in context. Moreover, a human evaluation reveals that more than 85% of the predictions made with the proposed method are acceptable as antonyms.
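As a rough illustration of the fill-in-the-blank framing described above (not the authors' fine-tuned model, which is not reproduced here), the following sketch queries a masked language model for candidate words at a blank position. It uses the Hugging Face transformers library with the generic bert-base-uncased checkpoint; the example sentence and checkpoint choice are assumptions for demonstration only.

    import torch
    from transformers import BertForMaskedLM, BertTokenizer

    # Assumption: generic pre-trained weights stand in for the paper's
    # fine-tuned antonym-prediction model.
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")
    model.eval()

    # Fill-in-the-blank setup: the blank is the [MASK] token, and a good
    # prediction should contrast with "rises" while fitting the sentence.
    text = "The sun rises in the east and [MASK] in the west."
    inputs = tokenizer(text, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits

    # Locate the masked position and list the top-5 candidate tokens.
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    top_ids = logits[0, mask_pos].topk(5).indices[0]
    print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))

Without fine-tuning, such a model tends to favor merely plausible completions; the paper's contribution is fine-tuning so that the contrastive (antonymic) candidate is ranked highest.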