CK-Transformer: Commonsense Knowledge Enhanced Transformers for Referring Expression Comprehension

Zhi Zhang, Helen Yannakoudakis, Xiantong Zhen, Ekaterina Shutova


Abstract
The task of multimodal referring expression comprehension (REC), which aims to localize an image region described by a natural language expression, has recently received increasing attention within the research community. In this paper, we focus specifically on referring expression comprehension with commonsense knowledge (KB-Ref), a task that typically requires reasoning beyond spatial, visual, or semantic information. We propose a novel framework of Commonsense Knowledge Enhanced Transformers (CK-Transformer) that effectively integrates commonsense knowledge into the representations of objects in an image, facilitating identification of the target objects referred to by the expressions. We conduct extensive experiments on several benchmarks for the task of KB-Ref. Our results show that the proposed CK-Transformer achieves a new state of the art, with an absolute accuracy improvement of 3.14% over the existing state of the art.
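
The abstract describes fusing commonsense knowledge into per-object representations and matching them against the referring expression. The snippet below is a minimal, hypothetical sketch of one way such fusion could be wired up, assuming precomputed region features, retrieved knowledge embeddings, and a pooled expression embedding; it is not the paper's CK-Transformer implementation, and all module names and dimensions are illustrative.

```python
# Hypothetical sketch (not the authors' code): fuse per-object commonsense
# knowledge embeddings with visual region features, contextualize the objects
# with a transformer encoder, and score each object against the expression.
import torch
import torch.nn as nn


class KnowledgeFusionScorer(nn.Module):
    def __init__(self, visual_dim=2048, know_dim=768, text_dim=768, d_model=768):
        super().__init__()
        # Project visual region features and knowledge embeddings into a shared space.
        self.visual_proj = nn.Linear(visual_dim, d_model)
        self.know_proj = nn.Linear(know_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.text_proj = nn.Linear(text_dim, d_model)

    def forward(self, region_feats, know_feats, expr_feat):
        # region_feats: (B, N, visual_dim) object region features
        # know_feats:   (B, N, know_dim)   commonsense-fact embeddings per object
        # expr_feat:    (B, text_dim)      pooled referring-expression embedding
        fused = self.visual_proj(region_feats) + self.know_proj(know_feats)
        fused = self.encoder(fused)                      # contextualize objects jointly
        query = self.text_proj(expr_feat).unsqueeze(-1)  # (B, d_model, 1)
        scores = torch.bmm(fused, query).squeeze(-1)     # (B, N) per-object matching scores
        return scores                                    # argmax over N picks the referred object


# Toy usage with random tensors (batch of 2, 10 candidate objects).
model = KnowledgeFusionScorer()
scores = model(torch.randn(2, 10, 2048), torch.randn(2, 10, 768), torch.randn(2, 768))
print(scores.argmax(dim=-1))  # predicted object index per example
```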
Anthology ID:
2023.findings-eacl.196
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2586–2596
URL:
https://aclanthology.org/2023.findings-eacl.196
DOI:
10.18653/v1/2023.findings-eacl.196
Cite (ACL):
Zhi Zhang, Helen Yannakoudakis, Xiantong Zhen, and Ekaterina Shutova. 2023. CK-Transformer: Commonsense Knowledge Enhanced Transformers for Referring Expression Comprehension. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2586–2596, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
CK-Transformer: Commonsense Knowledge Enhanced Transformers for Referring Expression Comprehension (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.196.pdf
Video:
https://aclanthology.org/2023.findings-eacl.196.mp4