Unsupervised KB-to-Text Generation with Auxiliary Triple Extraction using Dual Learning

Zihao Fu, Bei Shi, Lidong Bing, Wai Lam


Abstract
The KB-to-text task aims at generating text that describes given KB triples. Traditional methods usually map KB triples to sentences via a supervised seq-to-seq model. However, existing annotated datasets are limited in size and human labeling is expensive. In this paper, we propose a method that trains the generation model in a completely unsupervised way with unaligned raw text and KB triples. Our method exploits a novel dual training framework that leverages the inverse relationship between the KB-to-text generation task and an auxiliary triple extraction task. In our architecture, we reconstruct KB triples or texts in a closed loop that links a generator and an extractor. Therefore, a loss function that accounts for the reconstruction error of KB triples and texts can be used to train both the generator and the extractor. To resolve the cold-start problem in training, we propose a method using a pseudo data generator, which generates pseudo texts and KB triples for learning an initial model. To resolve the multiple-triple problem, we design an allocated reinforcement learning component to optimize the reconstruction loss. The experimental results demonstrate that our model outperforms other unsupervised generation methods and comes close to the bound set by supervised methods.
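The closed-loop idea can be illustrated with a minimal sketch, assuming hypothetical stub interfaces for the generator and extractor; the paper's actual seq-to-seq models, pseudo-data initialization, and allocated reinforcement learning component are not reproduced here.

```python
# Minimal sketch of the dual-learning closed loop from the abstract.
# generate_text and extract_triples are hypothetical stubs standing in for
# the trained generator G and extractor E; the real models are neural.

def generate_text(triples):
    # Hypothetical generator G: KB triples -> text (stub for illustration).
    return " . ".join(f"{s} {p} {o}" for s, p, o in triples)

def extract_triples(text):
    # Hypothetical extractor E: text -> KB triples (stub for illustration).
    return [tuple(seg.split(" ", 2)) for seg in text.split(" . ")]

def triple_reconstruction_loss(original, reconstructed):
    # Closed loop T -> G(T) -> E(G(T)): penalize triples lost in the round trip.
    missed = set(original) - set(reconstructed)
    return len(missed) / max(len(original), 1)

def dual_training_step(triples, text):
    # Direction 1: reconstruct the triples through the generated text.
    loss_t = triple_reconstruction_loss(
        triples, extract_triples(generate_text(triples)))
    # Direction 2 (symmetric): reconstruct the text through extracted triples.
    loss_x = 0.0 if generate_text(extract_triples(text)) == text else 1.0
    # In the paper these reconstruction signals jointly train G and E
    # (with allocated RL handling the multiple-triple case); here we
    # simply return them.
    return loss_t, loss_x

triples = [("Alan_Turing", "birthPlace", "London")]
text = "Alan_Turing birthPlace London"
print(dual_training_step(triples, text))  # -> (0.0, 0.0) for this toy pair
```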
Anthology ID:
2020.aacl-main.29
Volume:
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Kam-Fai Wong, Kevin Knight, Hua Wu
Venue:
AACL
Publisher:
Association for Computational Linguistics
Pages:
258–268
URL:
https://aclanthology.org/2020.aacl-main.29
Cite (ACL):
Zihao Fu, Bei Shi, Lidong Bing, and Wai Lam. 2020. Unsupervised KB-to-Text Generation with Auxiliary Triple Extraction using Dual Learning. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 258–268, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Unsupervised KB-to-Text Generation with Auxiliary Triple Extraction using Dual Learning (Fu et al., AACL 2020)
PDF:
https://aclanthology.org/2020.aacl-main.29.pdf
Data:
DBpedia, WebNLG