Meaning Representations for Natural Languages: Design, Models and Applications

Jeffrey Flanigan, Ishan Jindal, Yunyao Li, Tim O’Gorman, Martha Palmer, Nianwen Xue


Abstract
This tutorial reviews the design of common meaning representations, SoTA models for predicting meaning representations, and the applications of meaning representations in a wide range of downstream NLP tasks and real-world applications. Presented by a diverse team of NLP researchers from academia and industry with extensive experience in designing, building, and using meaning representations, our tutorial has three components: (1) an introduction to common meaning representations, including basic concepts and design challenges; (2) a review of SoTA methods for building models of meaning representations; and (3) an overview of applications of meaning representations in downstream NLP tasks and real-world applications. We will also present qualitative comparisons of common meaning representations and a quantitative study of how their differences impact model performance. Finally, we will share best practices for choosing the right meaning representation for downstream tasks.
Anthology ID:
2022.emnlp-tutorials.1
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts
Month:
December
Year:
2022
Address:
Abu Dhabi, UAE
Editors:
Samhaa R. El-Beltagy, Xipeng Qiu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1–8
URL:
https://aclanthology.org/2022.emnlp-tutorials.1
DOI:
10.18653/v1/2022.emnlp-tutorials.1
Cite (ACL):
Jeffrey Flanigan, Ishan Jindal, Yunyao Li, Tim O’Gorman, Martha Palmer, and Nianwen Xue. 2022. Meaning Representations for Natural Languages: Design, Models and Applications. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts, pages 1–8, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Meaning Representations for Natural Languages: Design, Models and Applications (Flanigan et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-tutorials.1.pdf