%0 Conference Proceedings
%T Knowledge-Enriched Natural Language Generation
%A Yu, Wenhao
%A Jiang, Meng
%A Hu, Zhiting
%A Wang, Qingyun
%A Ji, Heng
%A Rajani, Nazneen
%Y Jiang, Jing
%Y Vulić, Ivan
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts
%D 2021
%8 November
%I Association for Computational Linguistics
%C Punta Cana, Dominican Republic & Online
%F yu-etal-2021-knowledge
%X Knowledge-enriched text generation poses unique challenges in modeling and learning, driving active research in several core directions, ranging from integrated modeling of neural representations and symbolic information in the sequential/hierarchical/graphical structures, learning without direct supervisions due to the cost of structured annotation, efficient optimization and inference with massive and global constraints, to language grounding on multiple modalities, and generative reasoning with implicit commonsense knowledge and background knowledge. In this tutorial we will present a roadmap to line up the state-of-the-art methods to tackle these challenges on this cutting-edge problem. We will dive deep into various technical components: how to represent knowledge, how to feed knowledge into a generation model, how to evaluate generation results, and what are the remaining challenges?
%R 10.18653/v1/2021.emnlp-tutorials.3
%U https://aclanthology.org/2021.emnlp-tutorials.3
%U https://doi.org/10.18653/v1/2021.emnlp-tutorials.3
%P 11-16