Multi-TimeLine Summarization (MTLS): Improving Timeline Summarization by Generating Multiple Summaries

Yi Yu, Adam Jatowt, Antoine Doucet, Kazunari Sugiyama, Masatoshi Yoshikawa


Abstract
In this paper, we address a novel task, Multiple TimeLine Summarization (MTLS), which extends the flexibility and versatility of TimeLine Summarization (TLS). Given any collection of time-stamped news articles, MTLS automatically discovers important yet different stories and generates a corresponding timeline for each story. To achieve this, we propose a novel unsupervised summarization framework based on two-stage affinity propagation. We also introduce a quantitative evaluation measure for MTLS based on previous TLS evaluation methods. Experimental results show that our MTLS framework demonstrates high effectiveness and that the MTLS task can give better results than TLS.
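The story-discovery step described in the abstract relies on affinity propagation clustering over time-stamped articles. The sketch below is only a minimal illustration of that general idea using scikit-learn's AffinityPropagation on TF-IDF vectors; it is not the authors' two-stage framework, and the example articles, dates, and parameter values are hypothetical placeholders.

```python
# Minimal sketch (not the paper's implementation): a single affinity-propagation
# pass that groups time-stamped news articles into candidate "stories".
# Article texts, dates, and parameters are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AffinityPropagation

articles = [
    ("2011-03-11", "Massive earthquake and tsunami strike northeastern Japan."),
    ("2011-03-12", "Explosion reported at the Fukushima Daiichi nuclear plant."),
    ("2011-02-11", "President Mubarak resigns after weeks of protests in Egypt."),
    ("2011-02-12", "Egypt's military council takes control after Mubarak steps down."),
]

texts = [text for _, text in articles]
X = TfidfVectorizer(stop_words="english").fit_transform(texts)

# Affinity propagation chooses the number of clusters (stories) automatically;
# each cluster's exemplar could then seed one timeline.
ap = AffinityPropagation(damping=0.9, random_state=0)
labels = ap.fit_predict(X.toarray())

for (date, text), label in zip(articles, labels):
    print(f"story {label}: {date}  {text}")
```

In this toy setting the two Japan-related articles and the two Egypt-related articles would be expected to fall into separate clusters, each corresponding to one story for which a timeline could then be generated.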
Anthology ID:
2021.acl-long.32
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
377–387
URL:
https://aclanthology.org/2021.acl-long.32
DOI:
10.18653/v1/2021.acl-long.32
PDF:
https://aclanthology.org/2021.acl-long.32.pdf
Optional supplementary material:
 2021.acl-long.32.OptionalSupplementaryMaterial.zip