On the Transferability of Visually Grounded PCFGs

Yanpeng Zhao, Ivan Titov


Abstract
There has recently been a significant surge of interest in visually grounded grammar induction. While a variety of models have been developed for the task and have demonstrated impressive performance, they have not been evaluated on text domains different from the training domain, so it is unclear whether the improvements brought by visual grounding are transferable. Our study aims to fill this gap and assess the degree of transferability. We start by extending VC-PCFG (short for Visually-grounded Compound PCFG [[Zhao and Titov, 2020](https://aclanthology.org/2020.emnlp-main.354/)]) so that it can transfer across text domains. We consider a zero-shot transfer learning setting in which a model is trained on the source domain and applied directly to target domains, without any further training. Our experimental results suggest that the benefits of visual grounding transfer to text in a domain similar to the training domain but fail to transfer to remote domains. Further, through data and result analysis, we find that the lexicon overlap between the source and target domains is the most important factor in the transferability of VC-PCFG.
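The lexicon-overlap factor highlighted in the abstract can be illustrated with a simple type-level vocabulary-overlap measure. This is a minimal sketch; the function name, tokenization by whitespace, and the exact overlap definition are illustrative assumptions, not necessarily the metric used in the paper.

```python
def lexicon_overlap(source_sentences, target_sentences):
    """Fraction of target-domain word types that also occur in the
    source-domain vocabulary (an illustrative type-level measure;
    not necessarily the paper's exact metric)."""
    source_vocab = {w.lower() for s in source_sentences for w in s.split()}
    target_vocab = {w.lower() for s in target_sentences for w in s.split()}
    if not target_vocab:
        return 0.0
    return len(source_vocab & target_vocab) / len(target_vocab)

# Toy example: a caption-like source domain vs. two target sentences,
# one near the source domain and one from a remote (news-like) domain.
src = ["a dog runs on the grass", "a man rides a bike"]
tgt = ["the dog chases the bike", "stocks fell sharply today"]
print(lexicon_overlap(src, tgt))  # → 0.375
```

Higher overlap would be expected for target domains close to the training captions, matching the paper's finding that transfer succeeds on similar domains and fails on remote ones.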
Anthology ID:
2023.findings-emnlp.530
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7895–7910
URL:
https://aclanthology.org/2023.findings-emnlp.530
DOI:
10.18653/v1/2023.findings-emnlp.530
Cite (ACL):
Yanpeng Zhao and Ivan Titov. 2023. On the Transferability of Visually Grounded PCFGs. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7895–7910, Singapore. Association for Computational Linguistics.
Cite (Informal):
On the Transferability of Visually Grounded PCFGs (Zhao & Titov, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.530.pdf