%0 Conference Proceedings
%T Transformer-based Lexically Constrained Headline Generation
%A Yamada, Kosuke
%A Hitomi, Yuta
%A Tamori, Hideaki
%A Sasano, Ryohei
%A Okazaki, Naoaki
%A Inui, Kentaro
%A Takeda, Koichi
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F yamada-etal-2021-transformer
%X This paper explores a variant of automatic headline generation methods, where a generated headline is required to include a given phrase such as a company or a product name. Previous methods using Transformer-based models generate a headline including a given phrase by providing the encoder with additional information corresponding to the given phrase. However, these methods cannot always include the phrase in the generated headline. Inspired by previous RNN-based methods generating token sequences in backward and forward directions from the given phrase, we propose a simple Transformer-based method that guarantees to include the given phrase in the high-quality generated headline. We also consider a new headline generation strategy that takes advantage of the controllable generation order of Transformer. Our experiments with the Japanese News Corpus demonstrate that our methods, which are guaranteed to include the phrase in the generated headline, achieve ROUGE scores comparable to previous Transformer-based methods. We also show that our generation strategy performs better than previous strategies.
%R 10.18653/v1/2021.emnlp-main.335
%U https://aclanthology.org/2021.emnlp-main.335
%U https://doi.org/10.18653/v1/2021.emnlp-main.335
%P 4085-4090