Understanding Retrieval Robustness for Retrieval-augmented Image Captioning

Wenyan Li, Jiaang Li, Rita Ramos, Raphael Tang, Desmond Elliott


Abstract
Recent advances in retrieval-augmented models for image captioning highlight the benefit of retrieving related captions for efficient, lightweight models with strong domain-transfer capabilities. While these models demonstrate the success of retrieval augmentation, retrieval models remain imperfect in practice: the retrieved information can sometimes mislead the model, resulting in incorrect generation and worse performance. In this paper, we analyze the robustness of the retrieval-augmented captioning model SmallCap. Our analysis shows that the model is sensitive to tokens that appear in the majority of the retrieved captions, and input attribution shows that these tokens are likely to be copied into the generated output. Given these findings, we propose to train the model by sampling retrieved captions from more diverse sets. This decreases the chance that the model learns to copy majority tokens, and it improves both in-domain and cross-domain performance.
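To make the proposed training strategy concrete, below is a minimal Python sketch of sampling the captions used for retrieval augmentation from a larger candidate pool, rather than always taking the fixed top-k. The function name and the k/pool_size values are illustrative assumptions, not the paper's exact configuration.

import random

def sample_retrieved_captions(ranked_captions, k=4, pool_size=16, rng=random):
    """Sample k captions from the top-`pool_size` retrieved candidates
    instead of always using the top-k, so that repeated 'majority' tokens
    appear less consistently in the prompt and are harder to copy."""
    pool = ranked_captions[:pool_size]
    if len(pool) <= k:
        return list(pool)
    return rng.sample(pool, k)

if __name__ == "__main__":
    # Hypothetical retriever output, ranked by image-text similarity.
    ranked = [f"candidate caption {i}" for i in range(20)]
    # Each training step sees a different subset, unlike fixed top-k retrieval.
    print(sample_retrieved_captions(ranked, k=4, pool_size=16))

Because the sampled subset changes across training steps, a token must be genuinely useful, not merely frequent among the nearest retrieved captions, for the model to learn to rely on it.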
Anthology ID: 2024.acl-long.503
Volume: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 9285–9299
URL: https://aclanthology.org/2024.acl-long.503
DOI: 10.18653/v1/2024.acl-long.503
Cite (ACL):
Wenyan Li, Jiaang Li, Rita Ramos, Raphael Tang, and Desmond Elliott. 2024. Understanding Retrieval Robustness for Retrieval-augmented Image Captioning. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9285–9299, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Understanding Retrieval Robustness for Retrieval-augmented Image Captioning (Li et al., ACL 2024)
PDF: https://aclanthology.org/2024.acl-long.503.pdf