Even the Simplest Baseline Needs Careful Re-investigation: A Case Study on XML-CNN

Si-An Chen, Jie-jyun Liu, Tsung-Han Yang, Hsuan-Tien Lin, Chih-Jen Lin
Abstract
The power and potential of deep learning models attract many researchers to design advanced and sophisticated architectures. Nevertheless, the reported progress is sometimes illusory for various possible reasons. In this work, through a striking example, we argue that more effort should be devoted to verifying progress when developing a new deep learning method. For XML-CNN, a highly influential multi-label text classification method, we show that the superior performance claimed in the original paper was mainly due to some remarkable coincidences. We re-examine XML-CNN and provide a re-implementation that reveals findings contradicting the claims in the original paper. Our study suggests suitable baselines for multi-label text classification tasks and confirms that progress on a new architecture cannot be confidently justified without a careful investigation.
Anthology ID:
2022.naacl-main.145
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1987–2000
URL:
https://aclanthology.org/2022.naacl-main.145
DOI:
10.18653/v1/2022.naacl-main.145
Cite (ACL):
Si-An Chen, Jie-jyun Liu, Tsung-Han Yang, Hsuan-Tien Lin, and Chih-Jen Lin. 2022. Even the Simplest Baseline Needs Careful Re-investigation: A Case Study on XML-CNN. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1987–2000, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Even the Simplest Baseline Needs Careful Re-investigation: A Case Study on XML-CNN (Chen et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.145.pdf
Software:
2022.naacl-main.145.software.zip