Generalising Multilingual Concept-to-Text NLG with Language Agnostic Delexicalisation

Giulio Zhou, Gerasimos Lampouras


Abstract
Concept-to-text Natural Language Generation is the task of expressing an input meaning representation in natural language. Previous approaches in this task have been able to generalise to rare or unseen instances by relying on a delexicalisation of the input. However, this often requires that the input values appear verbatim in the output text. This poses challenges in multilingual settings, where the task expands to generating the output text in multiple languages given the same input. In this paper, we explore the application of multilingual models in concept-to-text and propose Language Agnostic Delexicalisation, a novel delexicalisation method that uses multilingual pretrained embeddings, and employs a character-level post-editing model to inflect words in their correct form during relexicalisation. Our experiments across five datasets and five languages show that multilingual models outperform monolingual models in concept-to-text and that our framework outperforms previous approaches, especially in low-resource conditions.
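For readers unfamiliar with delexicalisation, the sketch below illustrates the classic verbatim approach that the abstract contrasts with. Function names, the placeholder format, and the example meaning representation are illustrative assumptions, not the authors' implementation; the paper's Language Agnostic Delexicalisation instead matches values via multilingual pretrained embeddings and inflects them during relexicalisation with a character-level post-editing model.

import re

# Minimal sketch of classic (verbatim) delexicalisation and relexicalisation.
# The placeholder scheme @attribute@ and the toy meaning representation are
# hypothetical, chosen only to illustrate the idea.

def delexicalise(mr: dict, text: str) -> str:
    """Replace attribute values that appear verbatim in the text with placeholders."""
    for attr, value in mr.items():
        text = re.sub(re.escape(value), f"@{attr}@", text, flags=re.IGNORECASE)
    return text

def relexicalise(mr: dict, text: str) -> str:
    """Substitute placeholders back with the surface values from the input MR."""
    for attr, value in mr.items():
        text = text.replace(f"@{attr}@", value)
    return text

mr = {"name": "Blue Spice", "food": "Italian"}
text = "Blue Spice serves Italian food."
delex = delexicalise(mr, text)     # "@name@ serves @food@ food."
print(delex)
print(relexicalise(mr, delex))     # recovers the original sentence

# In a multilingual setting the value may be inflected or translated in the
# output (e.g. a declined form in German), so verbatim matching fails; this is
# the gap the paper's embedding-based matching and character-level post-editing
# are designed to close.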
Anthology ID:
2021.acl-long.10
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
114–127
URL:
https://aclanthology.org/2021.acl-long.10
DOI:
10.18653/v1/2021.acl-long.10
Cite (ACL):
Giulio Zhou and Gerasimos Lampouras. 2021. Generalising Multilingual Concept-to-Text NLG with Language Agnostic Delexicalisation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 114–127, Online. Association for Computational Linguistics.
Cite (Informal):
Generalising Multilingual Concept-to-Text NLG with Language Agnostic Delexicalisation (Zhou & Lampouras, ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.10.pdf
Video:
https://aclanthology.org/2021.acl-long.10.mp4
Data
CrossWOZ