SBAAM! Eliminating Transcript Dependency in Automatic Subtitling

Marco Gaido, Sara Papi, Matteo Negri, Mauro Cettolo, Luisa Bentivogli


Abstract
Subtitling plays a crucial role in enhancing the accessibility of audiovisual content and encompasses three primary subtasks: translating spoken dialogue, segmenting translations into concise textual units, and estimating the timestamps that govern their on-screen duration. Past attempts to automate this process rely, to varying degrees, on automatic transcripts, employed in different ways across the three subtasks. In response to the acknowledged limitations of this reliance on transcripts, recent research has shifted towards transcription-free solutions for translation and segmentation, leaving the direct generation of timestamps as uncharted territory. To fill this gap, we introduce the first direct model capable of producing automatic subtitles, eliminating any dependence on intermediate transcripts, including for timestamp prediction. Experimental results, backed by manual evaluation, demonstrate our solution’s new state-of-the-art performance across multiple language pairs and diverse conditions.
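As background for the abstract above (not part of the paper itself): the three subtasks jointly produce subtitle blocks, i.e., segmented translated text paired with start/end timestamps that determine on-screen duration. Below is a minimal, purely illustrative Python sketch that prints such blocks in the standard SubRip (SRT) format; the helper function and the sample blocks are assumptions for illustration, not output of the SBAAM model.

    # Illustrative sketch: render subtitle blocks in SubRip (SRT) notation.
    # The sample data below is hypothetical, not model output.

    def srt_timestamp(seconds: float) -> str:
        """Format a time in seconds as the SRT timestamp HH:MM:SS,mmm."""
        ms = round(seconds * 1000)
        h, ms = divmod(ms, 3_600_000)
        m, ms = divmod(ms, 60_000)
        s, ms = divmod(ms, 1_000)
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

    # Each block: (start time, end time, translated and segmented text).
    blocks = [
        (1.20, 3.80, "Welcome, everyone."),
        (3.95, 7.10, "Today we look at subtitling\nwithout transcripts."),
    ]

    for i, (start, end, text) in enumerate(blocks, 1):
        print(i)
        print(f"{srt_timestamp(start)} --> {srt_timestamp(end)}")
        print(text)
        print()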
Anthology ID:
2024.acl-long.201
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3673–3691
URL:
https://aclanthology.org/2024.acl-long.201
Cite (ACL):
Marco Gaido, Sara Papi, Matteo Negri, Mauro Cettolo, and Luisa Bentivogli. 2024. SBAAM! Eliminating Transcript Dependency in Automatic Subtitling. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3673–3691, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
SBAAM! Eliminating Transcript Dependency in Automatic Subtitling (Gaido et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.201.pdf