Towards Detecting Contextual Real-Time Toxicity for In-Game Chat

Zachary Yang, Nicolas Grenon-Godbout, Reihaneh Rabbany


Abstract
Real-time toxicity detection in online environments poses a significant challenge due to the increasing prevalence of social media and gaming platforms. We introduce ToxBuster, a simple and scalable model that reliably detects toxic content in real time for a line of chat by including chat history and metadata. ToxBuster consistently outperforms conventional toxicity models across popular multiplayer games, including Rainbow Six Siege, For Honor, and DOTA 2. We conduct an ablation study to assess the importance of each model component and explore ToxBuster’s transferability across datasets. Furthermore, we showcase ToxBuster’s efficacy in post-game moderation, successfully flagging 82.1% of chat-reported players at a precision level of 90.0%. Additionally, we show how an additional 6% of unreported toxic players can be proactively moderated.
Anthology ID:
2023.findings-emnlp.663
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9894–9906
URL:
https://aclanthology.org/2023.findings-emnlp.663
DOI:
10.18653/v1/2023.findings-emnlp.663
Cite (ACL):
Zachary Yang, Nicolas Grenon-Godbout, and Reihaneh Rabbany. 2023. Towards Detecting Contextual Real-Time Toxicity for In-Game Chat. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 9894–9906, Singapore. Association for Computational Linguistics.
Cite (Informal):
Towards Detecting Contextual Real-Time Toxicity for In-Game Chat (Yang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.663.pdf