Prototype-Based Interpretability for Legal Citation Prediction

Chu Fei Luo, Rohan Bhambhoria, Samuel Dahan, Xiaodan Zhu


Abstract
Deep learning has made significant progress in the past decade and demonstrates potential to solve problems with extensive social impact. In high-stakes decision-making areas such as law, experts often require interpretability before automatic systems can be used in practical settings. In this work, we address these requirements for the important problem of legal citation prediction (LCP). We design the task with parallels to the thought process of lawyers, i.e., with reference to both precedents and legislative provisions. After initial experimental results, we refine the target citation predictions with the feedback of legal experts. Additionally, we introduce a prototype architecture to add interpretability, achieving strong performance while adhering to decision parameters used by lawyers. Our study builds on state-of-the-art language processing models for law, while addressing vital considerations for high-stakes tasks with practical societal impact.
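As a rough illustration of the kind of prototype-based interpretability the abstract describes, the sketch below shows a generic prototype classification head in Python/PyTorch. It is not the authors' released implementation; the class name PrototypeHead, its dimensions, and the use of negative squared Euclidean distance are assumptions chosen for clarity. The idea is that a text encoder embeds a case description, the embedding is compared against learned prototype vectors, and the resulting similarities both drive the citation prediction and can be surfaced as evidence.

# Minimal sketch (assumed API, not the paper's code): a prototype-based
# classification head on top of an encoder embedding.
import torch
import torch.nn as nn

class PrototypeHead(nn.Module):
    def __init__(self, hidden_dim: int, num_prototypes: int, num_citations: int):
        super().__init__()
        # Learned prototype vectors living in the encoder's embedding space.
        self.prototypes = nn.Parameter(torch.randn(num_prototypes, hidden_dim))
        # Linear map from prototype similarities to citation logits.
        self.classifier = nn.Linear(num_prototypes, num_citations)

    def forward(self, embedding: torch.Tensor):
        # embedding: (batch, hidden_dim) pooled output of a text encoder.
        dists = torch.cdist(embedding, self.prototypes)   # (batch, num_prototypes)
        sims = -dists.pow(2)                               # higher = closer to prototype
        logits = self.classifier(sims)                     # citation scores
        # Returning the similarities lets a user inspect which prototypes
        # (e.g., representative precedent passages) supported a prediction.
        return logits, sims

In such a setup, each prototype can be projected back to its nearest training example, so an explanation takes the form "this case was cited because the input resembles these known passages", which mirrors how lawyers reason from precedent.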
Anthology ID:
2023.findings-acl.301
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4883–4898
URL:
https://aclanthology.org/2023.findings-acl.301
DOI:
10.18653/v1/2023.findings-acl.301
Cite (ACL):
Chu Fei Luo, Rohan Bhambhoria, Samuel Dahan, and Xiaodan Zhu. 2023. Prototype-Based Interpretability for Legal Citation Prediction. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4883–4898, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Prototype-Based Interpretability for Legal Citation Prediction (Luo et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.301.pdf
Video:
https://aclanthology.org/2023.findings-acl.301.mp4