Hirokazu Masataki


2018

Multi-task and Multi-lingual Joint Learning of Neural Lexical Utterance Classification based on Partially-shared Modeling
Ryo Masumura | Tomohiro Tanaka | Ryuichiro Higashinaka | Hirokazu Masataki | Yushi Aono
Proceedings of the 27th International Conference on Computational Linguistics

This paper is an initial study of multi-task and multi-lingual joint learning for lexical utterance classification. A major problem in constructing lexical utterance classification modules for spoken dialogue systems is that individual data resources are often limited or unbalanced across tasks and/or languages. Various studies have examined joint learning with neural-network based shared modeling; however, previous joint learning studies focused on either cross-task or cross-lingual knowledge transfer. In order to support multi-task and multi-lingual joint learning simultaneously, our idea is to explicitly divide state-of-the-art neural lexical utterance classification into language-specific components that can be shared between different tasks and task-specific components that can be shared between different languages. In addition, in order to effectively transfer knowledge between different task data sets and different language data sets, this paper proposes a partially-shared modeling method that possesses both shared components and components specific to individual data sets. We demonstrate the effectiveness of the proposed method using Japanese and English data sets with three different lexical utterance classification tasks.
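The partially-shared idea lends itself to a compact sketch. The following is a minimal, hypothetical PyTorch illustration (class, layer, and parameter names are ours, not the authors'): language-specific embeddings and encoders shared across tasks, task-specific output layers shared across languages, and a shared projection used alongside per-task private projections.

```python
# Minimal sketch of partially-shared multi-task, multi-lingual utterance
# classification. All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class PartiallySharedClassifier(nn.Module):
    def __init__(self, vocab_sizes, num_labels, emb_dim=64, hid_dim=128):
        super().__init__()
        # Language-specific components, shared across tasks:
        # one embedding table and one encoder per language.
        self.embeddings = nn.ModuleDict({
            lang: nn.Embedding(v, emb_dim) for lang, v in vocab_sizes.items()
        })
        self.encoders = nn.ModuleDict({
            lang: nn.LSTM(emb_dim, hid_dim, batch_first=True)
            for lang in vocab_sizes
        })
        # Shared component used by every (task, language) pair.
        self.shared = nn.Linear(hid_dim, hid_dim)
        # Task-specific components, shared across languages:
        # a private projection plus an output layer per task.
        self.private = nn.ModuleDict({
            task: nn.Linear(hid_dim, hid_dim) for task in num_labels
        })
        self.outputs = nn.ModuleDict({
            task: nn.Linear(2 * hid_dim, n) for task, n in num_labels.items()
        })

    def forward(self, token_ids, lang, task):
        emb = self.embeddings[lang](token_ids)        # (B, T, E)
        _, (h, _) = self.encoders[lang](emb)          # h: (1, B, H)
        h = h.squeeze(0)
        # Concatenate shared and data-set-specific representations.
        feat = torch.cat([torch.tanh(self.shared(h)),
                          torch.tanh(self.private[task](h))], dim=-1)
        return self.outputs[task](feat)               # (B, num_labels[task])

# Usage: two languages, two toy tasks, a batch of random token ids.
model = PartiallySharedClassifier(
    vocab_sizes={"ja": 1000, "en": 1200},
    num_labels={"intent": 5, "sentiment": 3},
)
batch = torch.randint(0, 1000, (4, 10))
logits = model(batch, lang="ja", task="intent")
print(logits.shape)  # torch.Size([4, 5])
```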

2017

Hyperspherical Query Likelihood Models with Word Embeddings
Ryo Masumura | Taichi Asami | Hirokazu Masataki | Kugatsu Sadamitsu | Kyosuke Nishida | Ryuichiro Higashinaka
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

This paper presents an initial study on hyperspherical query likelihood models (QLMs) for information retrieval (IR). Our motivation is to naturally utilize pre-trained word embeddings for probabilistic IR. To this end, the key idea is to directly treat the word embeddings as random variables of directional probabilistic models based on von Mises-Fisher distributions, which are closely related to cosine similarity. The proposed method allows us to take semantic similarities between documents and target queries into account on a theoretical basis, without introducing heuristic expansion techniques. In addition, this paper reveals relationships between hyperspherical QLMs and conventional QLMs. Experiments on document retrieval compare a hyperspherical QLM with conventional QLMs and with document distance metrics based on word or document embeddings.
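A rough sense of the hyperspherical QLM idea can be given in a few lines. The sketch below is our illustrative approximation, not the paper's implementation: it models a document as a von Mises-Fisher (vMF) distribution over unit-normalized word embeddings and scores query words by their vMF log-density, dropping the normalizing constant, which is identical for all documents at fixed concentration kappa and so does not affect ranking. The embeddings and kappa value are toy assumptions.

```python
# Minimal sketch of a hyperspherical query likelihood score based on
# von Mises-Fisher (vMF) distributions over word embeddings.
import numpy as np

def normalize(v):
    # Project vectors onto the unit hypersphere.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def vmf_log_likelihood(query_vecs, doc_vecs, kappa=20.0):
    """log p(query | doc) up to the vMF normalizing constant, which is
    constant across documents at fixed kappa and can be dropped for
    ranking. The vMF log-density is kappa * mu^T x + const."""
    mu = normalize(normalize(doc_vecs).mean(axis=0))  # document direction
    q = normalize(query_vecs)
    return float(kappa * (q @ mu).sum())

# Toy 3-dimensional "embeddings" for two documents and a two-word query.
rng = np.random.default_rng(0)
doc_a = rng.normal(size=(50, 3)) + np.array([1.0, 0.0, 0.0])
doc_b = rng.normal(size=(50, 3)) + np.array([0.0, 1.0, 0.0])
query = np.array([[1.0, 0.1, 0.0], [0.9, 0.0, 0.1]])

# The document whose mean direction is closer to the query scores higher.
print(vmf_log_likelihood(query, doc_a) > vmf_log_likelihood(query, doc_b))  # True
```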

2015

Hierarchical Latent Words Language Models for Robust Modeling to Out-of-Domain Tasks
Ryo Masumura | Taichi Asami | Takanobu Oba | Hirokazu Masataki | Sumitaka Sakauchi | Akinori Ito
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing