Collaboration or Corporate Capture? Quantifying NLP’s Reliance on Industry Artifacts and Contributions

Will Aitken, Mohamed Abdalla, Karen Rudie, Catherine Stinson


Abstract
The impressive performance of pre-trained models has garnered public attention and made news headlines in recent years. Almost always, these models are produced by or in collaboration with industry. Using them is critical for competing on natural language processing (NLP) benchmarks and, correspondingly, for staying relevant in NLP research. We surveyed 100 papers published at EMNLP 2022 to determine the degree to which researchers rely on industry models, other artifacts, and contributions to publish in prestigious NLP venues, and found that such industry outputs are cited at a rate at least three times greater than would be expected. Our work serves as a scaffold to enable future researchers to more accurately address whether: 1) collaboration with industry is still collaboration in the absence of an alternative, or 2) NLP inquiry has been captured by the motivations and research direction of private corporations.
Anthology ID:
2024.acl-long.188
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3433–3448
URL:
https://aclanthology.org/2024.acl-long.188
Cite (ACL):
Will Aitken, Mohamed Abdalla, Karen Rudie, and Catherine Stinson. 2024. Collaboration or Corporate Capture? Quantifying NLP’s Reliance on Industry Artifacts and Contributions. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3433–3448, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Collaboration or Corporate Capture? Quantifying NLP’s Reliance on Industry Artifacts and Contributions (Aitken et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.188.pdf