Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models

Jun-Hyung Park, Mingyu Lee, Junho Kim, SangKeun Lee


Abstract
In this paper, we introduce COCONUT to effectively guide the contextualization of structured commonsense knowledge based on large language models. COCONUT employs a contextualized knowledge prompting scheme to gather high-quality contextualization examples from a large language model. These examples are subsequently distilled into small language models to enhance their contextualization capability. Extensive evaluations show that COCONUT considerably improves commonsense reasoning performance across diverse benchmarks, models, and settings, exhibiting its flexibility and universality in generating contextualized commonsense knowledge. Notably, COCONUT consistently outperforms the state-of-the-art technique by an average of 5.8%.
Anthology ID:
2024.findings-acl.346
Volume:
Findings of the Association for Computational Linguistics ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5815–5830
URL:
https://aclanthology.org/2024.findings-acl.346
Cite (ACL):
Jun-Hyung Park, Mingyu Lee, Junho Kim, and SangKeun Lee. 2024. Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models. In Findings of the Association for Computational Linguistics ACL 2024, pages 5815–5830, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models (Park et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.346.pdf