Curricular Next Conversation Prediction Pretraining for Transcript Segmentation

Anvesh Rao Vijjini, Hanieh Deilamsalehy, Franck Dernoncourt, Snigdha Chaturvedi

Abstract
Transcript segmentation is the task of dividing a single continuous transcript into multiple segments. While document segmentation is a popular task, transcript segmentation poses significant challenges due to the relatively noisy and sporadic nature of transcript data. We propose pretraining strategies to address these challenges. The strategies are based on “Next Conversation Prediction” (NCP), whose underlying idea is to pretrain a model to identify consecutive conversations. We further introduce “Advanced NCP,” which makes the pretraining task more relevant to the downstream task of segmentation break prediction while remaining significantly easier. Finally, we introduce a curriculum for Advanced NCP (Curricular NCP) based on the similarity between pretraining and downstream task samples. Curricular NCP applied to a state-of-the-art text segmentation model outperforms prior results. We also show that our pretraining strategies make the model robust to the speech recognition errors commonly found in automatically generated transcripts.
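The abstract only sketches the NCP objective, so the Python below is a minimal illustration, not the authors' implementation. It assumes NCP is framed as binary classification of conversation adjacency, analogous to BERT-style next-sentence prediction; all names here (e.g., make_ncp_pairs) are hypothetical, and the paper's actual pair construction may differ.

```python
import random
from typing import List, Tuple

def make_ncp_pairs(conversations: List[str],
                   negative_ratio: float = 0.5,
                   seed: int = 0) -> List[Tuple[str, str, int]]:
    """Build Next Conversation Prediction pairs from one transcript's
    ordered list of conversations.

    Label 1: the two conversations are consecutive in the transcript.
    Label 0: the second conversation was sampled from elsewhere.
    """
    assert len(conversations) >= 4, "need enough conversations to sample negatives"
    rng = random.Random(seed)
    pairs = []
    for i in range(len(conversations) - 1):
        if rng.random() >= negative_ratio:
            # Adjacent conversations form a positive example.
            pairs.append((conversations[i], conversations[i + 1], 1))
        else:
            # Sample a non-adjacent conversation as a negative example.
            j = rng.randrange(len(conversations))
            while abs(j - i) <= 1:
                j = rng.randrange(len(conversations))
            pairs.append((conversations[i], conversations[j], 0))
    return pairs

# Usage: each pair feeds a standard sentence-pair classifier,
# e.g. "[CLS] conv_a [SEP] conv_b [SEP]" with a binary head.
convs = [
    "A: hi, thanks for joining  B: happy to be here",
    "A: first, the budget  B: we are over by ten percent",
    "B: next, hiring  A: two offers went out",
    "A: anything else?  B: no, we're done",
]
for a, b, label in make_ncp_pairs(convs):
    print(label, "|", a[:25], "||", b[:25])
```

Under this reading, Advanced NCP would change how the pairs are constructed to better mirror segmentation break prediction, and the curriculum would order the resulting pretraining samples by their similarity to downstream task samples.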
Anthology ID: 2023.findings-eacl.197
Volume: Findings of the Association for Computational Linguistics: EACL 2023
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2597–2607
URL: https://aclanthology.org/2023.findings-eacl.197
DOI: 10.18653/v1/2023.findings-eacl.197
Cite (ACL):
Anvesh Rao Vijjini, Hanieh Deilamsalehy, Franck Dernoncourt, and Snigdha Chaturvedi. 2023. Curricular Next Conversation Prediction Pretraining for Transcript Segmentation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2597–2607, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Curricular Next Conversation Prediction Pretraining for Transcript Segmentation (Vijjini et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-eacl.197.pdf
Video: https://aclanthology.org/2023.findings-eacl.197.mp4