STAR: SocioTechnical Approach to Red Teaming Language Models

Laura Weidinger, John F J Mellor, Bernat Guillén Pegueroles, Nahema Marchal, Ravin Kumar, Kristian Lum, Canfer Akbulut, Mark Diaz, A. Stevie Bergman, Mikel D. Rodriguez, Verena Rieser, William Isaac


Abstract
This research introduces STAR, a sociotechnical framework that improves on current best practices for red teaming the safety of large language models. STAR makes two key contributions. First, it enhances steerability by generating parameterised instructions for human red teamers, leading to improved coverage of the risk surface; parameterised instructions also provide more detailed insights into model failures at no increased cost. Second, STAR improves signal quality by matching demographics to assess harms for specific groups, resulting in more sensitive annotations. STAR further employs a novel arbitration step to leverage diverse viewpoints and improve label reliability, treating disagreement not as noise but as a valuable contribution to signal quality.
Anthology ID:
2024.emnlp-main.1200
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
21516–21532
URL:
https://aclanthology.org/2024.emnlp-main.1200
DOI:
10.18653/v1/2024.emnlp-main.1200
Cite (ACL):
Laura Weidinger, John F J Mellor, Bernat Guillén Pegueroles, Nahema Marchal, Ravin Kumar, Kristian Lum, Canfer Akbulut, Mark Diaz, A. Stevie Bergman, Mikel D. Rodriguez, Verena Rieser, and William Isaac. 2024. STAR: SocioTechnical Approach to Red Teaming Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 21516–21532, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
STAR: SocioTechnical Approach to Red Teaming Language Models (Weidinger et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.1200.pdf