Zero-Shot Cross-Lingual Sequence Tagging as Seq2Seq Generation for Joint Intent Classification and Slot Filling

Fei Wang, Kuan-Hao Huang, Anoop Kumar, Aram Galstyan, Greg Ver Steeg, Kai-Wei Chang


Abstract
The joint intent classification and slot filling task seeks to detect the intent of an utterance and extract its semantic concepts. In the zero-shot cross-lingual setting, a model is trained on a source language and then transferred to other target languages through multilingual representations without additional training data. While prior studies show that pre-trained multilingual sequence-to-sequence (Seq2Seq) models can facilitate zero-shot transfer, there is little understanding of how to design the output template for the joint prediction tasks. In this paper, we examine three aspects of the output template: (1) label mapping, (2) task dependency, and (3) word order. Experiments on the MASSIVE dataset, which covers 51 languages, show that our output template significantly improves the performance of pre-trained cross-lingual language models.
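To make the idea of an output template concrete, here is a minimal illustrative sketch of how an utterance with an intent label and word-level slot labels could be linearized into a single Seq2Seq target string. The template format, the "alarm_set" and "time" labels, and the example utterance are assumptions for illustration only, not the template proposed in the paper.

```python
from itertools import groupby


def build_target(intent: str, tokens: list[str], slots: list[str]) -> str:
    """Linearize tokens and word-level slot labels into one target string."""
    pieces = []
    # Group consecutive tokens that share the same slot label so that
    # multi-word slot values stay together in the generated sequence.
    for slot, group in groupby(zip(tokens, slots), key=lambda pair: pair[1]):
        words = " ".join(token for token, _ in group)
        pieces.append(words if slot == "O" else f"[{slot}: {words}]")
    # Prepend the intent label so a single generated sequence covers both tasks.
    return f"intent: {intent} ; " + " ".join(pieces)


if __name__ == "__main__":
    tokens = ["wake", "me", "up", "at", "seven", "am"]
    slots = ["O", "O", "O", "O", "time", "time"]
    print(build_target("alarm_set", tokens, slots))
    # -> intent: alarm_set ; wake me up at [time: seven am]
```

Choices such as whether slot labels are rendered as natural-language words (label mapping), whether the intent precedes or follows the slots (task dependency), and whether the tokens keep the source-language order (word order) correspond to the three template aspects the abstract mentions.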
Anthology ID:
2022.mmnlu-1.6
Volume:
Proceedings of the Massively Multilingual Natural Language Understanding Workshop (MMNLU-22)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Jack FitzGerald, Kay Rottmann, Julia Hirschberg, Mohit Bansal, Anna Rumshisky, Charith Peris, Christopher Hench
Venue:
MMNLU
Publisher:
Association for Computational Linguistics
Pages:
53–61
URL:
https://aclanthology.org/2022.mmnlu-1.6
DOI:
10.18653/v1/2022.mmnlu-1.6
Cite (ACL):
Fei Wang, Kuan-Hao Huang, Anoop Kumar, Aram Galstyan, Greg Ver Steeg, and Kai-Wei Chang. 2022. Zero-Shot Cross-Lingual Sequence Tagging as Seq2Seq Generation for Joint Intent Classification and Slot Filling. In Proceedings of the Massively Multilingual Natural Language Understanding Workshop (MMNLU-22), pages 53–61, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Zero-Shot Cross-Lingual Sequence Tagging as Seq2Seq Generation for Joint Intent Classification and Slot Filling (Wang et al., MMNLU 2022)
PDF:
https://aclanthology.org/2022.mmnlu-1.6.pdf
Video:
https://aclanthology.org/2022.mmnlu-1.6.mp4