ModSCAN: Measuring Stereotypical Bias in Large Vision-Language Models from Vision and Language Modalities

Yukun Jiang, Zheng Li, Xinyue Shen, Yugeng Liu, Michael Backes, Yang Zhang
Anthology ID:
2024.emnlp-main.713
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12814–12845
URL:
https://aclanthology.org/2024.emnlp-main.713
Cite (ACL):
Yukun Jiang, Zheng Li, Xinyue Shen, Yugeng Liu, Michael Backes, and Yang Zhang. 2024. ModSCAN: Measuring Stereotypical Bias in Large Vision-Language Models from Vision and Language Modalities. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 12814–12845, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
ModSCAN: Measuring Stereotypical Bias in Large Vision-Language Models from Vision and Language Modalities (Jiang et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.713.pdf