Did You Mean...? Confidence-based Trade-offs in Semantic Parsing

Elias Stengel-Eskin, Benjamin Van Durme


Abstract
We illustrate how a calibrated model can help balance common trade-offs in task-oriented parsing. In a simulated annotator-in-the-loop experiment, we show that well-calibrated confidence scores allow us to balance cost with annotator load, improving accuracy with a small number of interactions. We then examine how confidence scores can help optimize the trade-off between usability and safety. We show that confidence-based thresholding can substantially reduce the number of incorrect low-confidence programs executed; however, this comes at a cost to usability. We propose the DidYouMean system, which better balances usability and safety by rephrasing low-confidence inputs.
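The thresholding idea described in the abstract can be illustrated with a short sketch. The parser stub (parse_with_confidence), the thresholds, and the three-way execute/confirm/reject policy below are illustrative assumptions rather than the paper's implementation; they only show how a calibrated confidence score could gate program execution and trigger a "Did you mean...?" confirmation.

```python
# Illustrative sketch of confidence-based thresholding for a semantic parser.
# The parser stub and thresholds below are hypothetical, not the paper's code.

from dataclasses import dataclass


@dataclass
class ParseResult:
    program: str       # candidate program produced by the parser
    confidence: float  # calibrated probability that the program is correct


def parse_with_confidence(utterance: str) -> ParseResult:
    # Stand-in for a real parser that returns a calibrated confidence estimate.
    return ParseResult(program=f"search(query={utterance!r})", confidence=0.62)


def handle_utterance(utterance: str,
                     exec_threshold: float = 0.9,
                     confirm_threshold: float = 0.5) -> str:
    """Execute high-confidence parses, confirm mid-confidence ones
    ("Did you mean ...?"), and reject low-confidence ones."""
    result = parse_with_confidence(utterance)
    if result.confidence >= exec_threshold:
        return f"EXECUTE: {result.program}"
    if result.confidence >= confirm_threshold:
        return f"CONFIRM: Did you mean: {result.program}?"
    return "REJECT: Sorry, I did not understand that. Could you rephrase?"


print(handle_utterance("find flights to Singapore in December"))
# -> CONFIRM: Did you mean: search(query='find flights to Singapore in December')?
```

In a sketch like this, raising exec_threshold trades usability (more confirmations or rejections) for safety (fewer incorrect programs executed), which is the trade-off the paper studies.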
Anthology ID: 2023.emnlp-main.159
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2621–2629
URL: https://aclanthology.org/2023.emnlp-main.159
DOI: 10.18653/v1/2023.emnlp-main.159
Cite (ACL): Elias Stengel-Eskin and Benjamin Van Durme. 2023. Did You Mean...? Confidence-based Trade-offs in Semantic Parsing. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2621–2629, Singapore. Association for Computational Linguistics.
Cite (Informal): Did You Mean...? Confidence-based Trade-offs in Semantic Parsing (Stengel-Eskin & Van Durme, EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.159.pdf
Video: https://aclanthology.org/2023.emnlp-main.159.mp4