2020
Adapting BERT to Implicit Discourse Relation Classification with a Focus on Discourse Connectives
Yudai Kishimoto | Yugo Murawaki | Sadao Kurohashi
Proceedings of the Twelfth Language Resources and Evaluation Conference
BERT, a neural network-based language model pre-trained on large corpora, is a breakthrough in natural language processing, significantly outperforming previous state-of-the-art models in numerous tasks. However, there have been few reports on its application to implicit discourse relation classification, and it is not clear how BERT is best adapted to the task. In this paper, we test three methods of adaptation. (1) We perform additional pre-training on text tailored to discourse classification. (2) In expectation of knowledge transfer from explicit to implicit discourse relations, we add a task named explicit connective prediction at the additional pre-training step. (3) To exploit implicit connectives given by treebank annotators, we add a task named implicit connective prediction at the fine-tuning step. We demonstrate that these three techniques can be combined straightforwardly in a single training pipeline. Through comprehensive experiments, we found that the first and second techniques provided additional gains, while the last one did not.
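To make the connective prediction idea concrete, the following is a minimal sketch, not the authors' code, of framing explicit connective prediction as masked-token prediction with the Hugging Face transformers library. The example sentence and the choice of bert-base-uncased are illustrative assumptions; the paper's actual pipeline adds this as a training objective rather than an inference-time probe.

```python
# Minimal sketch of connective prediction as masked-token prediction.
# Assumptions: bert-base-uncased and the example sentence are illustrative,
# not taken from the paper.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask the explicit connective between two discourse arguments and ask
# BERT's masked-LM head to recover it.
text = "The market fell sharply [MASK] investors remained optimistic."
inputs = tokenizer(text, return_tensors="pt")
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits

# Inspect the top candidate connectives at the masked position.
top = torch.topk(logits[0, mask_pos], k=5).indices
print(tokenizer.convert_ids_to_tokens(top.tolist()))  # e.g. "but", "while", ...
```

In training, the same masked position would contribute a cross-entropy loss term alongside the standard masked language modeling objective, which is what allows the technique to slot into a single pipeline.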
2018
A Knowledge-Augmented Neural Network Model for Implicit Discourse Relation Classification
Yudai Kishimoto | Yugo Murawaki | Sadao Kurohashi
Proceedings of the 27th International Conference on Computational Linguistics
Identifying discourse relations that are not overtly marked with discourse connectives remains a challenging problem. The absence of explicit clues indicates a need for a combination of world knowledge and weak contextual clues, which can hardly be learned from a small amount of manually annotated data. In this paper, we address this problem by augmenting the input text with external knowledge and context, and by adopting a neural network model that can effectively handle the augmented text. Experiments show that external knowledge did improve classification accuracy. Contextual information provided no significant gain for implicit discourse relations, but it did for explicit ones.
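As an illustration of the knowledge-augmentation idea, here is a minimal Python sketch under stated assumptions: lookup_relations, the toy knowledge base, and the [KNOW] marker are hypothetical stand-ins, and the paper draws on real external resources rather than a toy dictionary.

```python
# Illustrative sketch (not the authors' implementation): augment the two
# discourse arguments with external-knowledge tokens before encoding.
from typing import List

def lookup_relations(w1: str, w2: str) -> List[str]:
    # Hypothetical knowledge-base lookup; a real system would query an
    # actual resource for relations holding between the two words.
    toy_kb = {("rain", "sunshine"): ["Antonym"]}
    return toy_kb.get((w1, w2), [])

def augment(arg1: List[str], arg2: List[str]) -> List[str]:
    # Append any relation holding between cross-argument word pairs, so
    # the classifier sees weak world-knowledge signals explicitly.
    extra = [rel for w1 in arg1 for w2 in arg2
             for rel in lookup_relations(w1, w2)]
    return arg1 + ["[SEP]"] + arg2 + ["[KNOW]"] + extra

print(augment(["the", "rain", "stopped"], ["sunshine", "returned"]))
# ['the', 'rain', 'stopped', '[SEP]', 'sunshine', 'returned', '[KNOW]', 'Antonym']
```

The design point is that the augmentation happens at the input level, so any sequence encoder can consume the enriched text without architectural changes.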
Improving Crowdsourcing-Based Annotation of Japanese Discourse Relations
Yudai Kishimoto | Shinnosuke Sawada | Yugo Murawaki | Daisuke Kawahara | Sadao Kurohashi
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)
2014
Post-editing user interface using visualization of a sentence structure
Yudai Kishimoto | Toshiaki Nakazawa | Daisuke Kawahara | Sadao Kurohashi
Proceedings of the 11th Conference of the Association for Machine Translation in the Americas
Translation has become increasingly important with globalization. To reduce its cost, it is necessary to use machine translation and, for accurate information dissemination, to take advantage of post-editing of machine translation output. Such post-editing (e.g., PET [Aziz et al., 2012]) is already practical for translation between European languages, for which statistical machine translation achieves high performance. However, due to the low accuracy of machine translation between languages with different word order, such as Japanese-English and Japanese-Chinese, post-editing has not been widely adopted for these language pairs.