Computationally Efficient Wasserstein Loss for Structured Labels

Ayato Toyokuni, Sho Yokoi, Hisashi Kashima, Makoto Yamada


Abstract
The problem of estimating the probability distribution of labels has been widely studied as a label distribution learning (LDL) problem, whose applications include age estimation, emotion analysis, and semantic segmentation. We propose a tree-Wasserstein-distance-regularized LDL algorithm, focusing on hierarchical text classification tasks. The method predicts the entire label hierarchy with neural networks, measuring the similarity between predicted and true labels by the tree-Wasserstein distance. Through experiments on synthetic and real-world datasets, we demonstrate that the proposed method successfully accounts for the structure of labels during training and compares favorably with the Sinkhorn algorithm in terms of computation time and memory usage.
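The computational advantage over Sinkhorn iteration comes from the fact that the 1-Wasserstein distance between two distributions supported on a tree has a closed form: a weighted sum, over the tree's edges, of the absolute difference in subtree mass. The sketch below is illustrative only (it is not the paper's implementation); it assumes the tree is given as parent pointers with nodes numbered so every parent index precedes its children.

```python
def tree_wasserstein(parent, weight, mu, nu):
    """1-Wasserstein distance between distributions mu and nu on a tree.

    parent[i] -- parent index of node i (root has parent -1)
    weight[i] -- length of the edge (parent[i], i); unused for the root
    mu, nu    -- probability mass assigned to each node

    Closed form: sum over edges e of weight(e) * |mu(subtree(e)) - nu(subtree(e))|.
    Runs in O(n) time and memory, versus iterative O(n^2)-per-step
    Sinkhorn updates on a dense cost matrix.
    """
    n = len(parent)
    diff = [mu[i] - nu[i] for i in range(n)]
    total = 0.0
    # Nodes are assumed numbered so parent[i] < i; a single reverse pass
    # then accumulates each subtree's mass difference bottom-up.
    for i in range(n - 1, 0, -1):
        total += weight[i] * abs(diff[i])
        diff[parent[i]] += diff[i]
    return total
```

For example, on a root with two unit-length child edges, moving all mass from one leaf to the other costs 2 (one unit of mass travels two edges), which the closed form recovers without solving a transport problem.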
Anthology ID:
2021.eacl-srw.1
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
April
Year:
2021
Address:
Online
Editors:
Ionut-Teodor Sorodoc, Madhumita Sushil, Ece Takmaz, Eneko Agirre
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1–7
URL:
https://aclanthology.org/2021.eacl-srw.1
DOI:
10.18653/v1/2021.eacl-srw.1
Cite (ACL):
Ayato Toyokuni, Sho Yokoi, Hisashi Kashima, and Makoto Yamada. 2021. Computationally Efficient Wasserstein Loss for Structured Labels. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 1–7, Online. Association for Computational Linguistics.
Cite (Informal):
Computationally Efficient Wasserstein Loss for Structured Labels (Toyokuni et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-srw.1.pdf