Attentively Embracing Noise for Robust Latent Representation in BERT

Gwenaelle Cunha Sergio, Dennis Singh Moirangthem, Minho Lee


Abstract
Modern digital personal assistants interact with users through voice, and therefore rely heavily on automatic speech recognition (ASR) to convert speech to text before performing further tasks. We introduce EBERT, which stands for EmbraceBERT, with the goal of extracting more robust latent representations for the classification of noisy ASR text. Conventionally, BERT is fine-tuned for downstream classification tasks using only the initial [CLS] token, with the remaining encoded tokens being discarded. We propose using all encoded transformer tokens and further encoding them with a novel attentive embracement layer and a multi-head attention layer. This approach treats the otherwise discarded tokens as a source of additional information, and uses the multi-head attention in conjunction with the attentive embracement layer to select important features from clean data during training. This allows the extraction of a robust latent vector, which improves classification performance at test time when the model is presented with noisy inputs. We evaluate our model on both the Chatbot and Snips corpora for intent classification with ASR errors. Results, reported as the mean F1-score over 10 runs, show that our model significantly outperforms the baseline model.
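To make the pooling idea concrete, below is a minimal PyTorch sketch of attention-based pooling over all encoder tokens rather than only [CLS]. The class name AttentivePooling and all hyperparameters are illustrative assumptions for exposition, not the authors' released gcunhase/embracebert implementation (which additionally uses the attentive embracement layer described in the paper):

```python
import torch
import torch.nn as nn

class AttentivePooling(nn.Module):
    """Pools all BERT encoder outputs into a single latent vector.

    Hypothetical sketch: re-encodes the full token sequence with
    multi-head self-attention, then takes an attention-weighted sum,
    instead of keeping only the [CLS] token.
    """

    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        self.mha = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.score = nn.Linear(hidden_size, 1)  # per-token importance score

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, hidden) -- all encoder outputs
        attended, _ = self.mha(tokens, tokens, tokens)        # self-attention over tokens
        weights = torch.softmax(self.score(attended), dim=1)  # (batch, seq_len, 1)
        return (weights * attended).sum(dim=1)                # (batch, hidden) latent vector


# Example: pool BERT-base-sized outputs and classify into 7 intents
pooler = AttentivePooling()
latent = pooler(torch.randn(4, 128, 768))  # e.g., last_hidden_state from BERT
logits = nn.Linear(768, 7)(latent)         # intent logits, shape (4, 7)
```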
Anthology ID:
2020.coling-main.311
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3479–3491
URL:
https://aclanthology.org/2020.coling-main.311
DOI:
10.18653/v1/2020.coling-main.311
Cite (ACL):
Gwenaelle Cunha Sergio, Dennis Singh Moirangthem, and Minho Lee. 2020. Attentively Embracing Noise for Robust Latent Representation in BERT. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3479–3491, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Attentively Embracing Noise for Robust Latent Representation in BERT (Cunha Sergio et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.311.pdf
Code:
gcunhase/embracebert
Data:
SNIPS