An Encoder with non-Sequential Dependency for Neural Data-to-Text Generation

Feng Nie, Jinpeng Wang, Rong Pan, Chin-Yew Lin


Abstract
Data-to-text generation aims to generate descriptions for structured input data (i.e., a table with multiple records). Existing neural methods for encoding input data fall into two categories: a) pooling-based encoders, which ignore dependencies between input records, or b) recurrent encoders, which model only sequential dependencies between input records. In our investigation, although the recurrent encoder generally outperforms the pooling-based encoder by learning sequential dependencies, it is sensitive to the order of the input records (i.e., performance decreases when random shuffling noise is injected into the input data). To overcome this problem, we propose to adopt the self-attention mechanism to learn dependencies between arbitrary input records. Experimental results show that the proposed method achieves comparable results and remains stable under random shuffling of the input data.
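The paper's exact model and hyperparameters are not shown on this page; the following is a minimal, hypothetical sketch of single-head scaled dot-product self-attention over record vectors, illustrating the property the abstract relies on: the output is equivariant to the order of the input records, so a subsequent pooling step is order-invariant.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(records, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over record vectors.

    records: (n, d) array, one row per input record.
    Returns an (n, d) array of contextualized record representations.
    """
    Q, K, V = records @ Wq, records @ Wk, records @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise record affinities
    return softmax(scores, axis=-1) @ V      # weighted mix of all records

# Toy data and (randomly initialized) projection weights -- illustrative only.
rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))

out = self_attention(X, Wq, Wk, Wv)

# Shuffling the records shuffles the outputs identically (permutation
# equivariance), unlike a recurrent encoder, whose states depend on order.
perm = rng.permutation(n)
assert np.allclose(self_attention(X[perm], Wq, Wk, Wv), out[perm])
```

Because each record attends to every other record directly, the encoder captures non-sequential dependencies without assuming any ordering of the input table.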
Anthology ID:
W19-8619
Volume:
Proceedings of the 12th International Conference on Natural Language Generation
Month:
October–November
Year:
2019
Address:
Tokyo, Japan
Venues:
INLG | WS
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
141–146
URL:
https://aclanthology.org/W19-8619
DOI:
10.18653/v1/W19-8619
Cite (ACL):
Feng Nie, Jinpeng Wang, Rong Pan, and Chin-Yew Lin. 2019. An Encoder with non-Sequential Dependency for Neural Data-to-Text Generation. In Proceedings of the 12th International Conference on Natural Language Generation, pages 141–146, Tokyo, Japan. Association for Computational Linguistics.
Cite (Informal):
An Encoder with non-Sequential Dependency for Neural Data-to-Text Generation (Nie et al., 2019)
PDF:
https://aclanthology.org/W19-8619.pdf