A Case Study on Neural Headline Generation for Editing Support

Kazuma Murao, Ken Kobayashi, Hayato Kobayashi, Taichi Yatsuka, Takeshi Masuyama, Tatsuru Higurashi, Yoshimune Tabuchi


Abstract
There have been many studies on neural headline generation models trained on large numbers of (article, headline) pairs. However, such models have few opportunities for practical use in the real world, since news articles typically already come with headlines. In this paper, we describe a practical use case of neural headline generation in a news aggregator, where dozens of professional editors constantly select important news articles and manually write headlines for them that are much shorter than the original ones. Specifically, we show how we deployed our model in an editing support tool and report the results of comparing the editors' behavior before and after the release.
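The abstract does not specify the model architecture, so the following is only a minimal, hypothetical sketch of the kind of encoder-decoder headline generator trained on (article, headline) pairs that the paper refers to. It is written in PyTorch with toy data; the class name, hyperparameters, and training loop are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed, not the authors' code) of a seq2seq headline
# generator trained on tokenized (article, headline) pairs.
import torch
import torch.nn as nn

VOCAB, EMB, HID, PAD = 1000, 64, 128, 0  # illustrative sizes

class Seq2SeqHeadline(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB, padding_idx=PAD)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, article_ids, headline_ids):
        # Encode the article, then decode the headline with teacher forcing.
        _, h = self.encoder(self.embed(article_ids))
        dec_out, _ = self.decoder(self.embed(headline_ids[:, :-1]), h)
        return self.out(dec_out)  # (batch, headline_len - 1, VOCAB)

model = Seq2SeqHeadline()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

# Toy batch standing in for tokenized (article, headline) pairs.
article = torch.randint(1, VOCAB, (8, 50))
headline = torch.randint(1, VOCAB, (8, 12))

logits = model(article, headline)
loss = loss_fn(logits.reshape(-1, VOCAB), headline[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
print(float(loss))

In an editing-support setting like the one described, a model of this kind would propose a short candidate headline that editors can accept or revise, rather than replacing manual editing.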
Anthology ID: N19-2010
Volume: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Industry Papers)
Month: June
Year: 2019
Address: Minneapolis, Minnesota
Editors: Anastassia Loukina, Michelle Morales, Rohit Kumar
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 73–82
URL: https://aclanthology.org/N19-2010
DOI: 10.18653/v1/N19-2010
Cite (ACL):
Kazuma Murao, Ken Kobayashi, Hayato Kobayashi, Taichi Yatsuka, Takeshi Masuyama, Tatsuru Higurashi, and Yoshimune Tabuchi. 2019. A Case Study on Neural Headline Generation for Editing Support. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Industry Papers), pages 73–82, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
A Case Study on Neural Headline Generation for Editing Support (Murao et al., NAACL 2019)
PDF: https://aclanthology.org/N19-2010.pdf
Presentation: N19-2010.Presentation.pdf