Words That Stick: Using Keyword Cohesion to Improve Text Segmentation

Amit Maraj, Miguel Vargas Martin, Masoud Makrehchi


Abstract
Text Segmentation (TS) is the task of dividing a body of text into coherent blocks, each defined chiefly by the topic it contains. Historically, techniques in this area have been unsupervised, though more recent success has come from supervised methods instead. Although supervised approaches see better performance, they require training data and upfront training time. We propose a new method called Coherence, which uses strong sentence embeddings to extract representative keywords that serve as the main constructors of sentences when comparing them to one another. Additionally, we maintain a store of previously found keywords so that each segment is represented by its accumulated keywords rather than by the immediate sentence alone. With our system, we show improved results over current state-of-the-art unsupervised techniques when analyzed using Pk and WindowDiff scores. Because it is unsupervised, Coherence requires no fine-tuning.
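The abstract evaluates segmentation quality with Pk and WindowDiff. As background (this is not the authors' code), here is a minimal sketch of these two standard metrics, assuming a segmentation over N sentences is encoded as a list of N-1 binary indicators, where 1 marks a boundary between adjacent sentences:

```python
# Sketch of the two evaluation metrics named in the abstract.
# Encoding assumption: a segmentation over N sentences is a list of
# N-1 0/1 indicators, where 1 marks a boundary between adjacent sentences.

def window_diff(ref, hyp, k):
    """Pevzner & Hearst (2002): fraction of sliding windows of size k
    in which the reference and hypothesis boundary counts disagree."""
    assert len(ref) == len(hyp)
    windows = len(ref) - k + 1
    errors = sum(
        1 for i in range(windows)
        if sum(ref[i:i + k]) != sum(hyp[i:i + k])
    )
    return errors / windows

def pk(ref, hyp, k):
    """Beeferman et al. (1999): probability that a window of size k
    disagrees on whether its two ends fall in the same segment."""
    assert len(ref) == len(hyp)
    windows = len(ref) - k + 1
    misses = sum(
        1 for i in range(windows)
        if (sum(ref[i:i + k]) == 0) != (sum(hyp[i:i + k]) == 0)
    )
    return misses / windows

ref = [0, 0, 1, 0, 0, 1, 0, 0]   # boundaries after sentences 3 and 6
hyp = [0, 1, 0, 0, 0, 1, 0, 0]   # first boundary placed one sentence early
print(window_diff(ref, hyp, 2))  # 2/7 ≈ 0.2857
print(pk(ref, hyp, 2))           # 2/7 ≈ 0.2857
```

For both metrics, lower is better, and k is conventionally set to half the average reference segment length.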
Anthology ID:
2024.conll-1.1
Volume:
Proceedings of the 28th Conference on Computational Natural Language Learning
Month:
November
Year:
2024
Address:
Miami, FL, USA
Editors:
Libby Barak, Malihe Alikhani
Venue:
CoNLL
Publisher:
Association for Computational Linguistics
Pages:
1–9
URL:
https://aclanthology.org/2024.conll-1.1
Cite (ACL):
Amit Maraj, Miguel Vargas Martin, and Masoud Makrehchi. 2024. Words That Stick: Using Keyword Cohesion to Improve Text Segmentation. In Proceedings of the 28th Conference on Computational Natural Language Learning, pages 1–9, Miami, FL, USA. Association for Computational Linguistics.
Cite (Informal):
Words That Stick: Using Keyword Cohesion to Improve Text Segmentation (Maraj et al., CoNLL 2024)
PDF:
https://aclanthology.org/2024.conll-1.1.pdf