%0 Conference Proceedings %T Balancing Methods for Multi-label Text Classification with Long-Tailed Class Distribution %A Huang, Yi %A Giledereli, Buse %A Köksal, Abdullatif %A Özgür, Arzucan %A Ozkirimli, Elif %Y Moens, Marie-Francine %Y Huang, Xuanjing %Y Specia, Lucia %Y Yih, Scott Wen-tau %S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing %D 2021 %8 November %I Association for Computational Linguistics %C Online and Punta Cana, Dominican Republic %F huang-etal-2021-balancing %X Multi-label text classification is a challenging task because it requires capturing label dependencies. It becomes even more challenging when the class distribution is long-tailed. Resampling and re-weighting are common approaches for addressing the class imbalance problem; however, they are not effective when label dependency is present alongside class imbalance, because they result in oversampling of common labels. Here, we introduce the application of balancing loss functions for multi-label text classification. We perform experiments on a general-domain dataset with 90 labels (Reuters-21578) and a domain-specific dataset from PubMed with 18211 labels. We find that a distribution-balanced loss function, which inherently addresses both the class imbalance and label linkage problems, outperforms commonly used loss functions. Distribution balancing methods have been successfully used in the image recognition field. Here, we show their effectiveness in natural language processing. Source code is available at https://github.com/blessu/BalancedLossNLP. %R 10.18653/v1/2021.emnlp-main.643 %U https://aclanthology.org/2021.emnlp-main.643 %U https://doi.org/10.18653/v1/2021.emnlp-main.643 %P 8153-8161