A Meta-framework for Spatiotemporal Quantity Extraction from Text

Qiang Ning, Ben Zhou, Hao Wu, Haoruo Peng, Chuchu Fan, Matt Gardner


Abstract
News events are often associated with quantities (e.g., the number of COVID-19 patients or the number of arrests in a protest), and it is often important to extract their type, time, and location from unstructured text in order to analyze these quantity events. This paper thus formulates the NLP problem of spatiotemporal quantity extraction, and proposes the first meta-framework for solving it. This meta-framework contains a formalism that decomposes the problem into several information extraction tasks, a shareable crowdsourcing pipeline, and transformer-based baseline models. We demonstrate the meta-framework in three domains—the COVID-19 pandemic, Black Lives Matter protests, and 2020 California wildfires—to show that the formalism is general and extensible, the crowdsourcing pipeline facilitates fast and high-quality data annotation, and the baseline system can handle spatiotemporal quantity extraction well enough to be practically useful. We release all resources for future research on this topic at https://github.com/steqe.
Anthology ID:
2022.acl-long.195
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2736–2749
URL:
https://aclanthology.org/2022.acl-long.195
DOI:
10.18653/v1/2022.acl-long.195
Cite (ACL):
Qiang Ning, Ben Zhou, Hao Wu, Haoruo Peng, Chuchu Fan, and Matt Gardner. 2022. A Meta-framework for Spatiotemporal Quantity Extraction from Text. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2736–2749, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
A Meta-framework for Spatiotemporal Quantity Extraction from Text (Ning et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.195.pdf
Video:
https://aclanthology.org/2022.acl-long.195.mp4