Permutation Invariant Strategy Using Transformer Encoders for Table Understanding

Sarthak Dash, Sugato Bagchi, Nandana Mihindukulasooriya, Alfio Gliozzo


Abstract
Representing text in tables is essential for many business intelligence tasks such as semantic retrieval, data exploration and visualization, and question answering. Existing methods that leverage pretrained Transformer encoders range from a simple construction of pseudo-sentences by concatenating text across rows or columns to complex parameter-intensive models that encode table structure and require additional pretraining. In this work, we introduce a novel encoding strategy for Transformer encoders that preserves the critical property of permutation invariance across rows or columns. Unlike existing state-of-the-art methods for table understanding, our approach requires no additional pretraining, yet it substantially outperforms them in almost all instances. We demonstrate its effectiveness through extensive experiments on existing tabular datasets, spanning three table interpretation tasks: column type annotation, relation extraction, and entity linking.
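
The abstract's central claim is that a Transformer-based table encoder should produce the same representation regardless of row or column order. As a general illustration of how such invariance can be realized (this is a minimal sketch of one set-pooling design, not the paper's specific encoding strategy), the code below encodes each cell of a column independently with a shared Transformer and then mean-pools the cell vectors across rows; since mean pooling ignores order, shuffling the rows cannot change the column representation. All class names, hyperparameters, and the pooling design itself are hypothetical.

import torch
import torch.nn as nn

class PermutationInvariantColumnEncoder(nn.Module):
    """Hypothetical sketch: encodes a table column as an unordered set of cells."""
    def __init__(self, vocab_size=30522, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.cell_encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, cell_token_ids):
        # cell_token_ids: (num_rows, tokens_per_cell) token ids for one column.
        tokens = self.embed(cell_token_ids)      # (rows, tokens, d_model)
        encoded = self.cell_encoder(tokens)      # each cell encoded independently
        cell_vecs = encoded.mean(dim=1)          # pool tokens -> one vector per cell
        return cell_vecs.mean(dim=0)             # mean over rows is order-agnostic

# Shuffling the rows leaves the column representation unchanged.
torch.manual_seed(0)
model = PermutationInvariantColumnEncoder().eval()  # eval() disables dropout
column = torch.randint(0, 30522, (5, 8))            # 5 rows, 8 tokens per cell
shuffled = column[torch.randperm(5)]
with torch.no_grad():
    assert torch.allclose(model(column), model(shuffled), atol=1e-5)

Mean pooling is only one of several symmetric aggregators (sum, max, or attention over a learned query) that give the same invariance guarantee; the choice trades off expressiveness against simplicity.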
Anthology ID:
2022.findings-naacl.59
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
788–800
URL:
https://aclanthology.org/2022.findings-naacl.59
DOI:
10.18653/v1/2022.findings-naacl.59
Cite (ACL):
Sarthak Dash, Sugato Bagchi, Nandana Mihindukulasooriya, and Alfio Gliozzo. 2022. Permutation Invariant Strategy Using Transformer Encoders for Table Understanding. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 788–800, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Permutation Invariant Strategy Using Transformer Encoders for Table Understanding (Dash et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.59.pdf
Video:
https://aclanthology.org/2022.findings-naacl.59.mp4