Shujie Li
2024
Unifying Structured Data as Graph for Data-to-Text Pre-Training
Shujie Li | Liang Li | Ruiying Geng | Min Yang | Binhua Li | Guanghu Yuan | Wanwei He | Shao Yuan | Can Ma | Fei Huang | Yongbin Li
Transactions of the Association for Computational Linguistics, Volume 12
Data-to-text (D2T) generation aims to transform structured data into natural language text. Data-to-text pre-training has proved powerful in enhancing D2T generation, yielding impressive performance. However, previous pre-training methods either oversimplify structured data into a sequence, disregarding the input structure, or design training objectives tailored to a specific data structure (e.g., table or knowledge graph). In this paper, we unify different types of structured data (i.e., table, key-value data, knowledge graph) into the graph format and cast different D2T generation tasks as graph-to-text generation. To effectively exploit the structural information of the input graph, we propose a structure-enhanced pre-training method for D2T generation built on a structure-enhanced Transformer. Concretely, we devise a position matrix for the Transformer that encodes the relative positional information of connected nodes in the input graph. In addition, we propose a new attention matrix that incorporates graph structures into the original Transformer by taking the available explicit connectivity structure into account. Extensive experiments on six benchmark datasets show the effectiveness of our model. Our source codes are available at https://github.com/AlibabaResearch/DAMO-ConvAI/tree/main/unid2t.
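To make the two structural ingredients concrete, here is a minimal PyTorch sketch of a self-attention step that consumes a relative position matrix and a graph connectivity matrix alongside the usual queries, keys, and values. This is an illustration of the general idea, not the authors' released implementation; all names (`graph_attention`, `rel_pos`, `adj`, `pos_bias`) are hypothetical, and a real model would learn the bias embedding as a persistent module rather than creating it per call.

```python
# Hypothetical sketch of structure-enhanced self-attention: a relative
# position bias plus a connectivity mask over standard scaled dot-product
# attention. Not the UniD2T code; names and shapes are assumptions.
import torch
import torch.nn.functional as F

def graph_attention(q, k, v, rel_pos, adj):
    """
    q, k, v: (n, d) query/key/value vectors for n graph nodes.
    rel_pos: (n, n) long tensor of relative positions between node pairs.
    adj:     (n, n) 0/1 connectivity matrix of the input graph
             (self-loops included so every row attends somewhere).
    """
    d = q.size(-1)
    scores = q @ k.T / d ** 0.5                    # standard attention scores
    # Position matrix: add a scalar bias per relative-position bucket.
    # Toy version: fresh weights each call; a real model would learn these.
    pos_bias = torch.nn.Embedding(int(rel_pos.max()) + 1, 1)
    scores = scores + pos_bias(rel_pos).squeeze(-1)
    # Attention matrix: mask out node pairs with no edge in the graph,
    # so attention follows the explicit connectivity structure.
    scores = scores.masked_fill(adj == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

n, d = 4, 8
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]])
rel_pos = torch.randint(0, 3, (n, n))
out = graph_attention(torch.randn(n, d), torch.randn(n, d),
                      torch.randn(n, d), rel_pos, adj)
print(out.shape)  # torch.Size([4, 8])
```

The mask restricts each node's attention to its graph neighbors, while the position bias lets the model distinguish, say, a cell-to-header edge from a cell-to-cell edge after tables and key-value data are unified into one graph format.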
TP-Link: Fine-grained Pre-Training for Text-to-SQL Parsing with Linking Information
Ziqiang Liu | Shujie Li | Zefeng Cai | Xiangyu Li | Yunshui Li | Chengming Li | Xiping Hu | Ruifeng Xu | Min Yang
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
In this paper, we introduce an innovative pre-training framework, TP-Link, which aims to improve context-dependent Text-to-SQL Parsing by leveraging Linking information. This enhancement is achieved through better representation of both natural language utterances and the database schema, ultimately facilitating more effective text-to-SQL conversations. We present two novel pre-training objectives: (i) an utterance linking prediction (ULP) task that models intricate syntactic relationships among natural language utterances in context-dependent text-to-SQL scenarios, and (ii) a schema linking prediction (SLP) task that focuses on capturing fine-grained schema linking relationships between the utterances and the database schema. Extensive experiments demonstrate that our proposed TP-Link achieves state-of-the-art performance on two leading downstream benchmarks (i.e., SParC and CoSQL).
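As a rough illustration of what an SLP-style objective looks like, the sketch below scores every (utterance token, schema item) pair from encoder states and trains against binary gold linking labels. This is a toy approximation under stated assumptions, not TP-Link's actual objective or code; `slp_loss` and all shapes are hypothetical.

```python
# Hypothetical schema-linking-prediction-style loss: score each
# (token, schema item) pair and supervise with a 0/1 linking matrix.
# Not the TP-Link implementation; names and shapes are assumptions.
import torch
import torch.nn.functional as F

def slp_loss(token_repr, schema_repr, link_labels):
    """
    token_repr:  (t, d) encoder states for t utterance tokens.
    schema_repr: (s, d) encoder states for s schema items (tables/columns).
    link_labels: (t, s) float 0/1 matrix marking which tokens mention
                 which schema items.
    """
    logits = token_repr @ schema_repr.T          # pairwise linking scores
    return F.binary_cross_entropy_with_logits(logits, link_labels)

t, s, d = 6, 3, 16
token_repr = torch.randn(t, d, requires_grad=True)
schema_repr = torch.randn(s, d, requires_grad=True)
labels = torch.randint(0, 2, (t, s)).float()
loss = slp_loss(token_repr, schema_repr, labels)
loss.backward()
print(loss.item())
```

In a full pre-training setup this pairwise loss would be added to the language-modeling objective, pushing the encoder to align utterance mentions with the tables and columns they refer to before any SQL is generated.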
Co-authors
- Min Yang 2
- Liang Li 1
- Ruiying Geng 1
- Binhua Li 1
- Guanghu Yuan 1