Leveraging Taxonomy and LLMs for Improved Multimodal Hierarchical Classification

Shijing Chen, Mohamed Reda Bouadjenek, Usman Naseem, Basem Suleiman, Shoaib Jameel, Flora Salim, Hakim Hacid, Imran Razzak


Abstract
Multi-level Hierarchical Classification (MLHC) tackles the challenge of categorizing items within a complex, multi-layered class structure. However, traditional MLHC classifiers often rely on a backbone model with n independent output layers, which tends to ignore the hierarchical relationships between classes. This oversight can lead to inconsistent predictions that violate the underlying taxonomy. Leveraging Large Language Models (LLMs), we propose a novel taxonomy-embedded transitional, LLM-agnostic framework for multimodal classification. The cornerstone of this advancement is the models' ability to enforce consistency across hierarchical levels. Our evaluations on the MEP-3M dataset (a Multi-modal E-commerce Product dataset with multiple hierarchical levels) demonstrate a significant performance improvement over conventional LLM structures.
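To illustrate the consistency problem the abstract describes, the sketch below shows one simple way independent per-level classifiers can contradict a taxonomy, and how masking child predictions by the predicted parent's subtree restores consistency. The toy taxonomy, class names, and masking scheme are hypothetical illustrations, not the framework proposed in the paper.

```python
import math

# Illustrative sketch only: enforcing top-down consistency across two
# levels of a class hierarchy. The toy taxonomy below is hypothetical
# and not taken from the paper.
TAXONOMY = {
    "electronics": ["phone", "laptop"],
    "clothing": ["shirt", "shoes"],
}
PARENTS = list(TAXONOMY)
CHILDREN = [c for kids in TAXONOMY.values() for c in kids]

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def consistent_predict(parent_logits, child_logits):
    """Choose a parent class first, then restrict the child prediction
    to that parent's subtree so the two levels cannot contradict."""
    p_probs = softmax(parent_logits)
    parent = PARENTS[p_probs.index(max(p_probs))]
    allowed = set(TAXONOMY[parent])
    c_probs = softmax(child_logits)
    # Zero out children that fall outside the predicted parent's subtree.
    masked = [p if c in allowed else 0.0 for c, p in zip(CHILDREN, c_probs)]
    child = CHILDREN[masked.index(max(masked))]
    return parent, child

# Independent per-level argmax would pick "shoes" here, contradicting
# the "electronics" parent; masking keeps the pair taxonomy-consistent.
print(consistent_predict([2.0, 0.5], [0.1, 0.2, 0.3, 2.5]))
# → ('electronics', 'laptop')
```

With independent output heads, the child head alone would have chosen "shoes" (the highest child logit), producing the invalid pair ("electronics", "shoes"); constraining the child choice to the predicted parent's subtree is the kind of cross-level consistency the paper's framework aims to enforce.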
Anthology ID:
2025.coling-main.417
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
6244–6254
URL:
https://aclanthology.org/2025.coling-main.417/
Cite (ACL):
Shijing Chen, Mohamed Reda Bouadjenek, Usman Naseem, Basem Suleiman, Shoaib Jameel, Flora Salim, Hakim Hacid, and Imran Razzak. 2025. Leveraging Taxonomy and LLMs for Improved Multimodal Hierarchical Classification. In Proceedings of the 31st International Conference on Computational Linguistics, pages 6244–6254, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Leveraging Taxonomy and LLMs for Improved Multimodal Hierarchical Classification (Chen et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.417.pdf