Yongkun Wang
2018
Learning What to Share: Leaky Multi-Task Network for Text Classification
Liqiang Xiao | Honglun Zhang | Wenqing Chen | Yongkun Wang | Yaohui Jin
Proceedings of the 27th International Conference on Computational Linguistics
Neural network based multi-task learning has achieved great success on many NLP problems by linking some layers to share knowledge among tasks and enhance performance. However, most existing approaches suffer from interference between tasks because they lack a selection mechanism for feature sharing. As a result, the feature space of a task can easily be contaminated by unhelpful features borrowed from other tasks, which confuses the model and degrades its predictions. In this paper, we propose a multi-task convolutional neural network with the Leaky Unit, which has memory and forgetting mechanisms to filter the feature flows between tasks. Experiments on five different datasets for text classification validate the benefits of our approach.
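A minimal sketch of the gating idea described in the abstract: a unit with forgetting and memory gates that decides how much of another task's features may "leak" into the current task's representation. This is not the authors' code; the GRU-style gate structure and the module name are assumptions for illustration.

```python
import torch
import torch.nn as nn

class LeakyUnit(nn.Module):
    """Hypothetical gated unit filtering feature flow from a source task to a target task."""
    def __init__(self, dim):
        super().__init__()
        self.W_r = nn.Linear(2 * dim, dim)  # forgetting (reset) gate
        self.W_z = nn.Linear(2 * dim, dim)  # memory (update) gate
        self.W_h = nn.Linear(2 * dim, dim)  # candidate feature transformation

    def forward(self, h_target, h_source):
        pair = torch.cat([h_target, h_source], dim=-1)
        r = torch.sigmoid(self.W_r(pair))                      # how much of the source to forget
        z = torch.sigmoid(self.W_z(pair))                      # how much filtered information to keep
        h_tilde = torch.tanh(
            self.W_h(torch.cat([h_target, r * h_source], dim=-1))
        )                                                      # candidate built from gated source features
        return (1 - z) * h_target + z * h_tilde                # leak only the selected portion into the target
```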
Multi-Task Label Embedding for Text Classification
Honglun Zhang | Liqiang Xiao | Wenqing Chen | Yongkun Wang | Yaohui Jin
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Multi-task learning in text classification leverages implicit correlations among related tasks to extract common features and yield performance gains. However, a large body of previous work treats the labels of each task as independent and meaningless one-hot vectors, which causes a loss of potential label information. In this paper, we propose Multi-Task Label Embedding to convert labels in text classification into semantic vectors, thereby turning the original tasks into vector matching tasks. Our model exploits semantic correlations among tasks and is easy to scale or transfer when new tasks are involved. Extensive experiments on five benchmark datasets for text classification show that our model can effectively improve the performance of related tasks with semantic representations of labels and additional information from each other.
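A minimal sketch of the label-embedding idea: the input text and every label description are encoded into the same semantic space, and classification reduces to matching the text vector against the label vectors. The shared mean-pooled encoder and cosine-similarity scoring are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelEmbeddingMatcher(nn.Module):
    """Hypothetical text classifier that scores texts against embedded label descriptions."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        # Shared encoder for both input texts and label descriptions.
        self.encoder = nn.EmbeddingBag(vocab_size, dim, mode="mean")

    def forward(self, text_ids, label_ids):
        # text_ids:  (batch, text_len)       token ids of the input texts
        # label_ids: (num_labels, label_len) token ids of each label's description
        text_vec = self.encoder(text_ids)    # (batch, dim)
        label_vec = self.encoder(label_ids)  # (num_labels, dim)
        # Score every (text, label) pair by cosine similarity; argmax over labels gives the class.
        return F.cosine_similarity(
            text_vec.unsqueeze(1), label_vec.unsqueeze(0), dim=-1
        )
```

Because new labels are just new vectors in the same space, adding or transferring tasks only requires encoding their label descriptions, which matches the scalability claim in the abstract.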
MCapsNet: Capsule Network for Text with Multi-Task Learning
Liqiang Xiao | Honglun Zhang | Wenqing Chen | Yongkun Wang | Yaohui Jin
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Multi-task learning can share knowledge among related tasks and implicitly increase the training data. However, it has long been frustrated by interference among tasks. This paper investigates the performance of capsule networks for text and proposes a capsule-based multi-task learning architecture that is unified, simple and effective. Exploiting the ability of capsules to cluster features, the proposed task routing algorithm clusters the features for each task in the network, which helps reduce interference among tasks. Experiments on six text classification datasets demonstrate the effectiveness of our models and their feature-clustering characteristics.
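A minimal sketch of how shared primary capsules could be routed separately into each task's class capsules, so that each task clusters only the features relevant to it. This assumes standard dynamic routing-by-agreement (Sabour et al., 2017) as the routing rule and is not the MCapsNet task routing algorithm itself.

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    # Non-linearity that shrinks short vectors toward zero and long ones toward unit length.
    norm2 = (s ** 2).sum(dim=dim, keepdim=True)
    return norm2 / (1 + norm2) * s / torch.sqrt(norm2 + eps)

def route_to_task(u_hat, iterations=3):
    # u_hat: (batch, in_caps, out_caps, dim) prediction vectors for one task's class capsules.
    b = torch.zeros(u_hat.shape[:3], device=u_hat.device)   # routing logits
    for _ in range(iterations):
        c = torch.softmax(b, dim=-1)                         # coupling coefficients per input capsule
        v = squash((c.unsqueeze(-1) * u_hat).sum(dim=1))     # (batch, out_caps, dim) task capsules
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)         # strengthen routes that agree
    return v

# Each task applies its own transformation and routing to the shared primary capsules,
# so the feature clusters formed for different tasks do not interfere with one another.
```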