Transformer-based Lexically Constrained Headline Generation

Kosuke Yamada, Yuta Hitomi, Hideaki Tamori, Ryohei Sasano, Naoaki Okazaki, Kentaro Inui, Koichi Takeda


Abstract
This paper explores a variant of automatic headline generation in which the generated headline is required to include a given phrase, such as a company or product name. Previous Transformer-based methods generate a headline including a given phrase by providing the encoder with additional information corresponding to that phrase. However, these methods cannot always include the phrase in the generated headline. Inspired by earlier RNN-based methods that generate token sequences backward and forward from the given phrase, we propose a simple Transformer-based method that is guaranteed to include the given phrase while maintaining headline quality. We also consider a new headline generation strategy that takes advantage of the controllable generation order of the Transformer. Our experiments on the Japanese News Corpus demonstrate that our methods, which guarantee inclusion of the phrase in the generated headline, achieve ROUGE scores comparable to previous Transformer-based methods. We also show that our generation strategy performs better than previous strategies.
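The abstract's core idea, generating a headline outward from a constrained phrase in both directions, can be sketched with a toy decoder. This is not the paper's Transformer model; the step functions below are hypothetical deterministic stand-ins that only illustrate why the phrase is present in the output by construction.

```python
# Toy sketch of backward-then-forward decoding from a constrained phrase
# (Seq2BF-style). The "models" are hypothetical stand-ins for the trained
# Transformer: each returns the next token to prepend or append, or EOS.

EOS = "<eos>"

def generate_with_phrase(phrase_tokens, backward_step, forward_step, max_len=10):
    """Grow a headline leftward, then rightward, from the given phrase.

    The phrase appears in the result by construction, regardless of
    what the step functions predict.
    """
    tokens = list(phrase_tokens)
    # Backward pass: prepend tokens until the model emits EOS.
    while len(tokens) < max_len:
        tok = backward_step(tokens)
        if tok == EOS:
            break
        tokens.insert(0, tok)
    # Forward pass: append tokens until the model emits EOS.
    while len(tokens) < max_len:
        tok = forward_step(tokens)
        if tok == EOS:
            break
        tokens.append(tok)
    return tokens

# Hypothetical deterministic "models" for illustration only.
def toy_backward(tokens):
    left = {"launches": "Acme", "Acme": EOS}
    return left.get(tokens[0], EOS)

def toy_forward(tokens):
    right = {"launches": "new", "new": "phone", "phone": EOS}
    return right.get(tokens[-1], EOS)

headline = generate_with_phrase(["launches"], toy_backward, toy_forward)
print(" ".join(headline))  # Acme launches new phone
```

Because decoding starts from the phrase itself rather than conditioning the encoder on it, the lexical constraint cannot be dropped, which is the guarantee the abstract contrasts with prior encoder-side approaches.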
Anthology ID:
2021.emnlp-main.335
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4085–4090
URL:
https://aclanthology.org/2021.emnlp-main.335
DOI:
10.18653/v1/2021.emnlp-main.335
Cite (ACL):
Kosuke Yamada, Yuta Hitomi, Hideaki Tamori, Ryohei Sasano, Naoaki Okazaki, Kentaro Inui, and Koichi Takeda. 2021. Transformer-based Lexically Constrained Headline Generation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4085–4090, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Transformer-based Lexically Constrained Headline Generation (Yamada et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.335.pdf
Video:
https://aclanthology.org/2021.emnlp-main.335.mp4
Code:
asahi-research/script-for-transformer-based-seq2bf
Data:
JNC