RED-ACE: Robust Error Detection for ASR using Confidence Embeddings

Zorik Gekhman, Dina Zverinski, Jonathan Mallinson, Genady Beryozkin


Abstract
ASR Error Detection (AED) models aim to post-process the output of Automatic Speech Recognition (ASR) systems in order to detect transcription errors. Modern approaches usually use text-based input, consisting solely of the ASR transcription hypothesis and disregarding additional signals from the ASR model. Instead, we utilize the ASR system's word-level confidence scores to improve AED performance. Specifically, we add an ASR Confidence Embedding (ACE) layer to the AED model's encoder, allowing us to jointly encode the confidence scores and the transcribed text into a contextualized representation. Our experiments show the benefits of ASR confidence scores for AED, their complementary effect over the textual signal, as well as the effectiveness and robustness of ACE for combining these signals. To foster further research, we publish a novel AED dataset consisting of ASR outputs on the LibriSpeech corpus with annotated transcription errors.
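
For intuition, the sketch below shows one way a confidence embedding layer could be wired into a BERT-style AED encoder: per-word ASR confidence scores are quantized into discrete buckets, embedded, and summed with the token and position embeddings before the transformer layers. This is a minimal illustrative sketch in PyTorch; the class names, hidden size, vocabulary size, and bucketing scheme are assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn


class ConfidenceEmbedding(nn.Module):
    """Maps word-level ASR confidence scores to embedding vectors
    (illustrative sketch of an ACE-style layer)."""

    def __init__(self, hidden_size: int = 768, num_buckets: int = 10):
        super().__init__()
        self.num_buckets = num_buckets
        self.embedding = nn.Embedding(num_buckets, hidden_size)

    def forward(self, confidence: torch.Tensor) -> torch.Tensor:
        # confidence: (batch, seq_len), values in [0, 1].
        # Quantize into discrete buckets, then look up the embedding.
        bucket_ids = torch.clamp(
            (confidence * self.num_buckets).long(), max=self.num_buckets - 1
        )
        return self.embedding(bucket_ids)


class AEDEncoderInput(nn.Module):
    """Combines token, position, and confidence embeddings into a single
    input representation for the AED encoder (assumed layout)."""

    def __init__(self, vocab_size: int = 30522, hidden_size: int = 768,
                 max_len: int = 512, num_buckets: int = 10):
        super().__init__()
        self.token_embedding = nn.Embedding(vocab_size, hidden_size)
        self.position_embedding = nn.Embedding(max_len, hidden_size)
        self.confidence_embedding = ConfidenceEmbedding(hidden_size, num_buckets)
        self.layer_norm = nn.LayerNorm(hidden_size)

    def forward(self, token_ids: torch.Tensor, confidence: torch.Tensor) -> torch.Tensor:
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = (self.token_embedding(token_ids)
             + self.position_embedding(positions)
             + self.confidence_embedding(confidence))
        return self.layer_norm(x)

A forward pass would then take the tokenized ASR hypothesis together with its per-token confidence scores, and the resulting contextualized representation would feed a standard token-level error-tagging head.
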
Anthology ID:
2022.emnlp-main.180
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2800–2808
URL:
https://aclanthology.org/2022.emnlp-main.180
DOI:
10.18653/v1/2022.emnlp-main.180
Cite (ACL):
Zorik Gekhman, Dina Zverinski, Jonathan Mallinson, and Genady Beryozkin. 2022. RED-ACE: Robust Error Detection for ASR using Confidence Embeddings. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 2800–2808, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
RED-ACE: Robust Error Detection for ASR using Confidence Embeddings (Gekhman et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.180.pdf