Semantic Parsing with Dual Learning

Ruisheng Cao, Su Zhu, Chen Liu, Jieyu Li, Kai Yu


Abstract
Semantic parsing converts natural language queries into structured logical forms. The lack of training data remains one of the most serious problems in this area. In this work, we develop a semantic parsing framework based on the dual learning algorithm, which enables a semantic parser to make full use of data (labeled and even unlabeled) through a dual-learning game. This game between a primal model (semantic parsing) and a dual model (logical form to query) forces them to regularize each other, and allows both to receive feedback signals from prior knowledge. By exploiting prior knowledge of logical form structures, we propose a novel reward signal at both the surface and semantic levels, which encourages the generation of complete and reasonable logical forms. Experimental results show that our approach achieves new state-of-the-art performance on the ATIS dataset and competitive performance on the OVERNIGHT dataset.
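The dual-learning game described in the abstract can be made concrete with a small training-loop sketch. The Python code below is a hypothetical, minimal illustration: the Seq2SeqStub class, the two reward functions, and the weighting alpha are placeholder assumptions for exposition, not the authors' implementation (their code is in the linked repository rhythmcao/semantic-parsing-dual). It shows one primal-start cycle in which the parser maps a query to a logical form, a validity reward is computed from prior knowledge of logical-form structure (surface and semantic levels), and the dual model's reconstruction likelihood supplies the remaining feedback.

# Hypothetical sketch of one dual-learning cycle for semantic parsing.
# Model classes, reward definitions, and weights are illustrative stand-ins,
# not the authors' actual implementation.

def surface_reward(logical_form):
    # Placeholder surface-level check: 1.0 if the form is well-formed
    # at the token level (here, balanced parentheses), else 0.0.
    return 1.0 if logical_form.count("(") == logical_form.count(")") else 0.0

def semantic_reward(logical_form):
    # Placeholder semantic-level check: 1.0 if the form passes a sanity
    # test (a real system might execute it against a knowledge base).
    return 1.0 if "lambda" in logical_form else 0.0

class Seq2SeqStub:
    """Toy stand-in for a neural sequence-to-sequence model."""

    def sample(self, source):
        # Sample an output sequence for `source`; a real model would
        # decode from an encoder-decoder network.
        return "( lambda $0 ( flight $0 ) )"

    def log_likelihood(self, source, target):
        # Log-probability of `target` given `source`; constant in this stub.
        return -1.0

    def reinforce_update(self, source, sample, reward):
        # Policy-gradient step scaled by `reward`; a no-op in this stub.
        pass

def dual_learning_step(parser, generator, query, alpha=0.5):
    """One primal-start cycle: query -> logical form -> reconstructed query."""
    lf = parser.sample(query)
    # Validity reward from prior knowledge of logical-form structure.
    validity = 0.5 * surface_reward(lf) + 0.5 * semantic_reward(lf)
    # Reconstruction reward: how well the dual model recovers the query.
    reconstruction = generator.log_likelihood(lf, query)
    reward = alpha * validity + (1 - alpha) * reconstruction
    parser.reinforce_update(query, lf, reward)
    generator.reinforce_update(lf, query, reward)

if __name__ == "__main__":
    parser, generator = Seq2SeqStub(), Seq2SeqStub()
    for query in ["show me flights from boston to denver"]:
        dual_learning_step(parser, generator, query)

In the full framework, an analogous cycle would presumably start from unlabeled logical forms as well, which is how unlabeled data on both sides of the task can be exploited.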
Anthology ID:
P19-1007
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
51–64
URL:
https://aclanthology.org/P19-1007
DOI:
10.18653/v1/P19-1007
Cite (ACL):
Ruisheng Cao, Su Zhu, Chen Liu, Jieyu Li, and Kai Yu. 2019. Semantic Parsing with Dual Learning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 51–64, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Semantic Parsing with Dual Learning (Cao et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1007.pdf
Video:
https://aclanthology.org/P19-1007.mp4
Code:
rhythmcao/semantic-parsing-dual