The Amazing World of Neural Language Generation

Yangfeng Ji, Antoine Bosselut, Thomas Wolf, Asli Celikyilmaz


Abstract
Neural Language Generation (NLG) – using neural network models to generate coherent text – is among the most promising methods for automated text creation. Recent years have seen a paradigm shift in neural text generation, driven by advances in deep contextual language modeling (e.g., LSTMs, GPT, GPT-2) and transfer learning (e.g., ELMo, BERT). While these tools have dramatically improved the state of NLG, particularly for low-resource tasks, state-of-the-art NLG models still face many challenges: a lack of diversity in generated text, commonsense violations in depicted situations, difficulties in making use of factual information, and the lack of reliable evaluation metrics. In this tutorial, we will present an overview of the current state of the art in neural network architectures and how they have shaped recent research directions in text generation. We will discuss how and why these models succeed or fail at generating coherent text, and provide insights into several applications.
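As a concrete illustration of the decoding trade-off behind the "lack of diversity" challenge mentioned above, the short sketch below contrasts greedy decoding with nucleus (top-p) sampling using a pretrained GPT-2. It assumes the Hugging Face transformers library and the public "gpt2" checkpoint; the tutorial abstract itself does not prescribe any particular toolkit, so this is only one way to reproduce the behavior it describes.

# Minimal sketch: greedy decoding vs. nucleus sampling with GPT-2.
# Assumes the Hugging Face `transformers` library (not prescribed by the tutorial).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Neural language generation is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Greedy decoding: deterministic, and prone to the repetitive,
# low-diversity continuations discussed in the abstract.
greedy = model.generate(
    input_ids,
    max_new_tokens=40,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)

# Nucleus (top-p) sampling: samples from the smallest token set whose
# cumulative probability exceeds p, trading determinism for diversity.
torch.manual_seed(0)
sampled = model.generate(
    input_ids,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))
print(tokenizer.decode(sampled[0], skip_special_tokens=True))

Running both decoders on the same prompt typically makes the contrast visible: the greedy continuation tends to loop or repeat phrases, while the sampled one varies from run to run (remove the fixed seed to see this).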
Anthology ID: 2020.emnlp-tutorials.7
Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts
Month: November
Year: 2020
Address: Online
Editors: Aline Villavicencio, Benjamin Van Durme
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 37–42
URL: https://aclanthology.org/2020.emnlp-tutorials.7
DOI: 10.18653/v1/2020.emnlp-tutorials.7
Cite (ACL): Yangfeng Ji, Antoine Bosselut, Thomas Wolf, and Asli Celikyilmaz. 2020. The Amazing World of Neural Language Generation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts, pages 37–42, Online. Association for Computational Linguistics.
Cite (Informal): The Amazing World of Neural Language Generation (Ji et al., EMNLP 2020)
PDF: https://aclanthology.org/2020.emnlp-tutorials.7.pdf