Tracing Influence at Scale: A Contrastive Learning Approach to Linking Public Comments and Regulator Responses

Linzi Xing, Brad Hackinen, Giuseppe Carenini


Abstract
U.S. federal regulators receive over one million comment letters each year from businesses, interest groups, and members of the public, all advocating for changes to proposed regulations. These comments are believed to have wide-ranging impacts on public policy. However, measuring the impact of specific comments is challenging because regulators are required to respond to comments, but they do not have to specify which comments they are addressing. In this paper, we propose a simple yet effective solution to this problem: an iterative contrastive method for training a neural model that matches text from public comments to responses written by regulators. We demonstrate that our approach substantially outperforms a set of text-matching baselines on a human-annotated test set. Furthermore, it delivers performance comparable to a state-of-the-art large language model (GPT-4), while being more cost-effective when matching comments and regulator responses at scale.
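The abstract describes a contrastive objective for matching comment text to regulator responses. As a rough illustration of the core idea only (not the authors' actual implementation, which is iterative and not specified here), the following Python sketch trains a transformer bi-encoder with an in-batch contrastive (InfoNCE-style) loss. The encoder checkpoint, temperature, and toy text pairs are all hypothetical assumptions.

# Illustrative sketch only: in-batch contrastive training of a bi-encoder
# for matching comment text to regulator responses. The checkpoint,
# temperature, and toy pairs below are assumptions, not the paper's setup.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # hypothetical choice of encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def embed(texts):
    # Mean-pool token embeddings into one fixed-size vector per text.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state           # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (B, H)

def contrastive_loss(comment_vecs, response_vecs, temperature=0.05):
    # Each comment's paired response is the positive; every other
    # response in the batch acts as an in-batch negative.
    sims = F.cosine_similarity(comment_vecs.unsqueeze(1),
                               response_vecs.unsqueeze(0), dim=-1)  # (B, B)
    labels = torch.arange(sims.size(0))
    return F.cross_entropy(sims / temperature, labels)

# Toy usage: one gradient step on two hypothetical matched pairs.
comments = ["The proposed rule imposes heavy costs on small lenders.",
            "We urge stronger monitoring requirements for emissions."]
responses = ["Several commenters argued the rule burdens small lenders.",
             "Commenters requested enhanced emissions monitoring."]
optimizer = torch.optim.AdamW(encoder.parameters(), lr=2e-5)
loss = contrastive_loss(embed(comments), embed(responses))
loss.backward()
optimizer.step()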
Anthology ID: 2023.nllp-1.26
Volume: Proceedings of the Natural Legal Language Processing Workshop 2023
Month: December
Year: 2023
Address: Singapore
Editors: Daniel Preoțiuc-Pietro, Catalina Goanta, Ilias Chalkidis, Leslie Barrett, Gerasimos (Jerry) Spanakis, Nikolaos Aletras
Venues: NLLP | WS
Publisher: Association for Computational Linguistics
Pages: 266–274
URL: https://aclanthology.org/2023.nllp-1.26
DOI: 10.18653/v1/2023.nllp-1.26
Cite (ACL):
Linzi Xing, Brad Hackinen, and Giuseppe Carenini. 2023. Tracing Influence at Scale: A Contrastive Learning Approach to Linking Public Comments and Regulator Responses. In Proceedings of the Natural Legal Language Processing Workshop 2023, pages 266–274, Singapore. Association for Computational Linguistics.
Cite (Informal):
Tracing Influence at Scale: A Contrastive Learning Approach to Linking Public Comments and Regulator Responses (Xing et al., NLLP-WS 2023)
PDF: https://aclanthology.org/2023.nllp-1.26.pdf