Incorporating Task-Specific Concept Knowledge into Script Learning

Chenkai Sun, Tie Xu, ChengXiang Zhai, Heng Ji


Abstract
In this paper, we present Tetris, a new task of Goal-Oriented Script Completion. Unlike previous work, it considers a more realistic and general setting, where the input includes not only the goal but also additional user context, such as preferences and history. To address this problem, we propose a novel approach that uses two techniques to improve performance: (1) concept prompting and (2) script-oriented contrastive learning, which addresses the problems of step repetition and hallucination. On our WikiHow-based dataset, we find that both techniques improve performance.
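For illustration only, the sketch below shows one way "concept prompting" could be wired into a seq2seq generator for goal-oriented script completion: task-related concepts are retrieved for the goal and prepended, together with the user context, to the encoder input. The model choice (T5), the input template, and the keyword-overlap retrieval are hypothetical assumptions for this sketch, not the paper's actual implementation.

# A minimal, hypothetical sketch of concept prompting for script completion.
# Everything below (model, template, toy concept store) is an assumption,
# not the method described in the paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "t5-small"  # placeholder; any seq2seq LM would work
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Toy concept store: maps goal keywords to task-specific concepts.
CONCEPT_STORE = {
    "garden": ["soil", "seeds", "watering can", "sunlight"],
    "cake": ["flour", "oven", "batter", "frosting"],
}

def retrieve_concepts(goal, k=4):
    """Return up to k concepts whose key appears in the goal (toy retrieval)."""
    concepts = []
    for key, values in CONCEPT_STORE.items():
        if key in goal.lower():
            concepts.extend(values)
    return concepts[:k]

def build_input(goal, history, preference):
    """Assemble goal, user context, and retrieved concepts into one prompt."""
    concepts = retrieve_concepts(goal)
    return (
        f"goal: {goal} | preference: {preference} | "
        f"history: {' ; '.join(history)} | concepts: {', '.join(concepts)} | "
        "predict next steps:"
    )

prompt = build_input(
    goal="Start a vegetable garden",
    history=["Pick a sunny spot in the yard."],
    preference="low-budget",
)
inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))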
Anthology ID: 2023.eacl-main.220
Volume: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 3026–3040
URL: https://aclanthology.org/2023.eacl-main.220
DOI: 10.18653/v1/2023.eacl-main.220
Cite (ACL): Chenkai Sun, Tie Xu, ChengXiang Zhai, and Heng Ji. 2023. Incorporating Task-Specific Concept Knowledge into Script Learning. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3026–3040, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): Incorporating Task-Specific Concept Knowledge into Script Learning (Sun et al., EACL 2023)
PDF: https://aclanthology.org/2023.eacl-main.220.pdf
Video: https://aclanthology.org/2023.eacl-main.220.mp4