Context or No Context? A preliminary exploration of human-in-the-loop approach for Incremental Temporal Summarization in meetings

Nicole Beckage, Shachi H Kumar, Saurav Sahay, Ramesh Manuvinakurike


Abstract
Incremental meeting temporal summarization, the task of summarizing relevant information from partial multi-party meeting dialogue, is emerging as the next challenge in summarization research. Here we examine the extent to which human abstractive summaries of the preceding increments (context) can be combined with extractive meeting dialogue to generate abstractive summaries. We find that previous context improves ROUGE scores. Our findings further suggest that the context begins to outweigh the dialogue. Using keyphrase extraction and semantic role labeling (SRL), we find that SRL captures relevant information without overwhelming the model architecture. By compressing the previous contexts by ~70%, we achieve better ROUGE scores than our baseline models. Collectively, these results suggest that context matters, as does the way in which context is presented to the model.
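The abstract evaluates summaries with ROUGE. As a minimal, illustrative sketch (not the authors' evaluation code; real evaluations typically use a dedicated package with stemming and multiple ROUGE variants), ROUGE-1 F1 can be computed as unigram overlap between a reference and a candidate summary:

```python
from collections import Counter

def rouge_1_f1(reference: str, candidate: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between a reference and a candidate summary."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Overlap counts each shared word at most min(ref, cand) times.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

# Identical summaries score 1.0; fully disjoint ones score 0.0.
print(rouge_1_f1("the meeting covered budget planning",
                 "the meeting covered budget planning"))  # → 1.0
```

This toy version omits stemming, tokenization details, and longer n-gram variants (ROUGE-2, ROUGE-L), all of which matter for reported scores.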
Anthology ID:
2021.newsum-1.11
Volume:
Proceedings of the Third Workshop on New Frontiers in Summarization
Month:
November
Year:
2021
Address:
Online and in Dominican Republic
Venues:
EMNLP | newsum
Publisher:
Association for Computational Linguistics
Pages:
96–106
URL:
https://aclanthology.org/2021.newsum-1.11
DOI:
10.18653/v1/2021.newsum-1.11
Bibkey:
Cite (ACL):
Nicole Beckage, Shachi H Kumar, Saurav Sahay, and Ramesh Manuvinakurike. 2021. Context or No Context? A preliminary exploration of human-in-the-loop approach for Incremental Temporal Summarization in meetings. In Proceedings of the Third Workshop on New Frontiers in Summarization, pages 96–106, Online and in Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Context or No Context? A preliminary exploration of human-in-the-loop approach for Incremental Temporal Summarization in meetings (Beckage et al., newsum 2021)
PDF:
https://aclanthology.org/2021.newsum-1.11.pdf
Data
CNN/Daily Mail