Table-based Fact Verification with Self-adaptive Mixture of Experts

Yuxuan Zhou, Xien Liu, Kaiyin Zhou, Ji Wu


Abstract
Table-based fact verification has recently gained widespread attention, yet it remains a very challenging problem. It inherently requires informative reasoning over natural language together with different types of numerical and logical reasoning on tables (e.g., count, superlative, comparative). Considering these challenges, we exploit mixture-of-experts and present a new method: the Self-adaptive Mixture-of-Experts Network (SaMoE). Specifically, we develop a mixture-of-experts neural network to recognize and execute different types of reasoning: the network is composed of multiple experts, each handling a specific part of the semantics for reasoning, while a management module decides the contribution of each expert network to the verification result. A self-adaptive method is further developed to teach the management module to combine the results of different experts more effectively, without external knowledge. Experimental results show that our framework achieves 85.1% accuracy on the benchmark dataset TabFact, comparable with the previous state-of-the-art models. We hope our framework can serve as a new baseline for table-based fact verification. Our code is available at https://github.com/THUMLP/SaMoE.
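The core idea in the abstract, several reasoning experts whose outputs are weighted by a management (gating) module, can be sketched in a generic mixture-of-experts combination step. This is a minimal illustration under our own assumptions, not the authors' SaMoE implementation; the expert names and all shapes below are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

def moe_combine(expert_logits, gate_scores):
    """Combine per-expert predictions with gating weights.

    expert_logits: (num_experts, num_classes) raw scores from each expert
    gate_scores:   (num_experts,) raw scores from the management module
    Returns class probabilities of shape (num_classes,).
    """
    gate_weights = softmax(gate_scores)            # one weight per expert
    expert_probs = softmax(expert_logits, axis=-1) # each expert's class distribution
    return gate_weights @ expert_probs             # weighted mixture of distributions

# Toy example: 3 hypothetical experts (say, count / superlative / comparative
# reasoning) voting on ENTAILED vs. REFUTED.
logits = np.array([[2.0, 0.1],
                   [0.5, 1.5],
                   [1.0, 1.0]])
gates = np.array([2.0, 0.0, -1.0])  # management module favors expert 0
probs = moe_combine(logits, gates)  # expert 0's ENTAILED vote dominates
```

Because the gate is a softmax over management-module scores, the output is always a valid probability distribution, and training the gate end-to-end lets the model learn which expert to trust for each statement type.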
Anthology ID:
2022.findings-acl.13
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
139–149
URL:
https://aclanthology.org/2022.findings-acl.13
DOI:
10.18653/v1/2022.findings-acl.13
Cite (ACL):
Yuxuan Zhou, Xien Liu, Kaiyin Zhou, and Ji Wu. 2022. Table-based Fact Verification with Self-adaptive Mixture of Experts. In Findings of the Association for Computational Linguistics: ACL 2022, pages 139–149, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Table-based Fact Verification with Self-adaptive Mixture of Experts (Zhou et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.13.pdf
Code
 thumlp/samoe
Data
TabFact