ClimateBERT-NetZero: Detecting and Assessing Net Zero and Reduction Targets

Tobias Schimanski, Julia Bingler, Mathias Kraus, Camilla Hyslop, Markus Leippold

Abstract
Public and private actors struggle to assess the vast amounts of information about sustainability commitments made by various institutions. To address this problem, we create a novel tool for automatically detecting corporate and national net zero and reduction targets in three steps. First, we introduce an expert-annotated data set with 3.5K text samples. Second, we train and release ClimateBERT-NetZero, a natural language classifier to detect whether a text contains a net zero or reduction target. Third, we showcase its analysis potential with two use cases: we first demonstrate how ClimateBERT-NetZero can be combined with conventional question-answering (Q&A) models to analyze the ambitions displayed in net zero and reduction targets. Furthermore, we employ the ClimateBERT-NetZero model on quarterly earnings call transcripts and outline how communication patterns evolve over time. Our experiments demonstrate promising pathways for extracting and analyzing net zero and emission reduction targets at scale.
Anthology ID:
2023.emnlp-main.975
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15745–15756
URL:
https://aclanthology.org/2023.emnlp-main.975
DOI:
10.18653/v1/2023.emnlp-main.975
Cite (ACL):
Tobias Schimanski, Julia Bingler, Mathias Kraus, Camilla Hyslop, and Markus Leippold. 2023. ClimateBERT-NetZero: Detecting and Assessing Net Zero and Reduction Targets. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 15745–15756, Singapore. Association for Computational Linguistics.
Cite (Informal):
ClimateBERT-NetZero: Detecting and Assessing Net Zero and Reduction Targets (Schimanski et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.975.pdf
Video:
https://aclanthology.org/2023.emnlp-main.975.mp4