Few-Shot Tabular Data Enrichment Using Fine-Tuned Transformer Architectures

Asaf Harari, Gilad Katz


Abstract
The enrichment of tabular datasets using external sources has gained significant attention in recent years. Existing solutions, however, either ignore external unstructured data completely or devise dataset-specific solutions. In this study, we propose Few-Shot Transformer-based Enrichment (FeSTE), a generic and robust framework for the enrichment of tabular datasets using unstructured data. By training over multiple datasets, our approach is able to develop generic models that can be applied to additional datasets with minimal training (i.e., few-shot). Our approach is based on an adaptation of BERT, for which we present a novel fine-tuning approach that reformulates the tuples of the datasets as sentences. Our evaluation, conducted on 17 datasets, shows that FeSTE is able to generate high-quality features and significantly outperform existing fine-tuning solutions.
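
The abstract's central idea, reformulating tabular tuples as sentences before fine-tuning BERT, can be illustrated with a minimal sketch in Python using Hugging Face Transformers. The verbalization template, the column names, and the single-step training loop below are illustrative assumptions, not the paper's exact procedure.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

def tuple_to_sentence(row: dict) -> str:
    # Verbalize a tabular tuple as a natural-language sentence
    # (hypothetical template; FeSTE's exact reformulation may differ).
    return ", ".join(f"{col} is {val}" for col, val in row.items()) + "."

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Example tuple (invented data, for illustration only).
row = {"name": "Dublin", "country": "Ireland", "population": "544,107"}
inputs = tokenizer(tuple_to_sentence(row), return_tensors="pt", truncation=True)
label = torch.tensor([1])

# One gradient step; in practice the model would be fine-tuned over many
# tuples from multiple datasets before few-shot adaptation to a new one.
model.train()
loss = model(**inputs, labels=label).loss
loss.backward()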
Anthology ID:
2022.acl-long.111
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1577–1591
URL:
https://aclanthology.org/2022.acl-long.111
DOI:
10.18653/v1/2022.acl-long.111
Cite (ACL):
Asaf Harari and Gilad Katz. 2022. Few-Shot Tabular Data Enrichment Using Fine-Tuned Transformer Architectures. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1577–1591, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Few-Shot Tabular Data Enrichment Using Fine-Tuned Transformer Architectures (Harari & Katz, ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.111.pdf
Software:
2022.acl-long.111.software.zip
Video:
https://aclanthology.org/2022.acl-long.111.mp4