Imran Sheikh, Emmanuel Vincent, and Irina Illina. 2022. Transformer versus LSTM Language Models Trained on Uncertain ASR Hypotheses in Limited Data Scenarios. In Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, and Stelios Piperidis, editors, Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 393–399, Marseille, France, June 2022. European Language Resources Association. Anthology ID: sheikh-etal-2022-transformer. URL: https://aclanthology.org/2022.lrec-1.41/