Generative Table Pre-training Empowers Models for Tabular Prediction

Tianping Zhang, Shaowen Wang, Shuicheng Yan, Jian Li, Qian Liu


Abstract
Recently, the topic of table pre-training has attracted considerable research interest. However, how to employ table pre-training to boost the performance of tabular prediction remains an open challenge. In this paper, we propose TapTap, the first attempt to leverage table pre-training to empower models for tabular prediction. After pre-training on a large corpus of real-world tabular data, TapTap can generate high-quality synthetic tables to support various applications on tabular data, including privacy protection, the low-resource regime, missing value imputation, and imbalanced classification. Extensive experiments on 12 datasets demonstrate that TapTap outperforms a total of 16 baselines in different scenarios. Meanwhile, it can be easily combined with various backbone models, including LightGBM, Multilayer Perceptron (MLP) and Transformer. Moreover, with the aid of table pre-training, models trained using synthetic data generated by TapTap can even compete with models using the original dataset on half of the experimental datasets, marking a milestone in the development of synthetic tabular data generation. The code and datasets are available at https://github.com/ZhangTP1996/TapTap.
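To make the downstream workflow described in the abstract concrete, below is a minimal sketch (not the authors' code) of how a synthetic table could be used to train a backbone model such as LightGBM and then be evaluated on real held-out data. The synthetic DataFrame here is a random placeholder standing in for rows that a pre-trained generator like TapTap would produce; the column names and the data-generating function are hypothetical.

```python
# Minimal sketch: train a backbone model on a synthetic table,
# evaluate on real data. Placeholder data stands in for TapTap output.
import numpy as np
import pandas as pd
from lightgbm import LGBMClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def make_table(n_rows: int) -> pd.DataFrame:
    """Placeholder table with two numeric features and a binary label."""
    age = rng.normal(40, 10, n_rows)
    income = rng.normal(50_000, 15_000, n_rows)
    label = (0.03 * age + 1e-5 * income + rng.normal(0, 0.5, n_rows) > 1.7).astype(int)
    return pd.DataFrame({"age": age, "income": income, "label": label})

synthetic = make_table(2_000)   # stand-in for generator-produced rows
real_test = make_table(500)     # stand-in for the original (real) test split

# Fit the backbone model purely on synthetic rows.
model = LGBMClassifier(n_estimators=200)
model.fit(synthetic[["age", "income"]], synthetic["label"])

# Evaluate on real data, as in the paper's synthetic-vs-original comparison.
pred = model.predict_proba(real_test[["age", "income"]])[:, 1]
print("AUC on real test data:", roc_auc_score(real_test["label"], pred))
```

The same pattern applies to the other backbones mentioned in the abstract (MLP, Transformer): only the model object changes, while the synthetic-train / real-test split stays the same.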
Anthology ID:
2023.emnlp-main.917
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14836–14854
URL:
https://aclanthology.org/2023.emnlp-main.917
DOI:
10.18653/v1/2023.emnlp-main.917
Cite (ACL):
Tianping Zhang, Shaowen Wang, Shuicheng Yan, Jian Li, and Qian Liu. 2023. Generative Table Pre-training Empowers Models for Tabular Prediction. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14836–14854, Singapore. Association for Computational Linguistics.
Cite (Informal):
Generative Table Pre-training Empowers Models for Tabular Prediction (Zhang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.917.pdf
Video:
https://aclanthology.org/2023.emnlp-main.917.mp4