%0 Conference Proceedings
%T LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network
%A Zhong, Wanjun
%A Tang, Duyu
%A Feng, Zhangyin
%A Duan, Nan
%A Zhou, Ming
%A Gong, Ming
%A Shou, Linjun
%A Jiang, Daxin
%A Wang, Jiahai
%A Yin, Jian
%Y Jurafsky, Dan
%Y Chai, Joyce
%Y Schluter, Natalie
%Y Tetreault, Joel
%S Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
%D 2020
%8 July
%I Association for Computational Linguistics
%C Online
%F zhong-etal-2020-logicalfactchecker
%X Verifying the correctness of a textual statement requires not only semantic reasoning about the meaning of words, but also symbolic reasoning about logical operations like count, superlative, aggregation, etc. In this work, we propose LogicalFactChecker, a neural network approach capable of leveraging logical operations for fact checking. It achieves the state-of-the-art performance on TABFACT, a large-scale, benchmark dataset built for verifying a textual statement with semi-structured tables. This is achieved by a graph module network built upon the Transformer-based architecture. With a textual statement and a table as the input, LogicalFactChecker automatically derives a program (a.k.a. logical form) of the statement in a semantic parsing manner. A heterogeneous graph is then constructed to capture not only the structures of the table and the program, but also the connections between inputs with different modalities. Such a graph reveals the related contexts of each word in the statement, the table and the program. The graph is used to obtain graph-enhanced contextual representations of words in Transformer-based architecture. After that, a program-driven module network is further introduced to exploit the hierarchical structure of the program, where semantic compositionality is dynamically modeled along the program structure with a set of function-specific modules. Ablation experiments suggest that both the heterogeneous graph and the module network are important to obtain strong results.
%R 10.18653/v1/2020.acl-main.539
%U https://aclanthology.org/2020.acl-main.539
%U https://doi.org/10.18653/v1/2020.acl-main.539
%P 6053-6065
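
The abstract above describes a "program-driven module network" in which each logical function has its own module and representations are composed recursively along the program's tree structure. The sketch below is only an illustrative toy of that idea, not the authors' implementation: the `ProgramNode` and `ModuleNetwork` classes, the per-function linear modules, and the mean pooling over arguments are all assumptions made for brevity.

```python
# Minimal, hypothetical sketch of composing function-specific modules
# along a program tree, in the spirit of the paper's module network.
import torch
import torch.nn as nn


class ProgramNode:
    """A node in a parsed program, e.g. count(filter_eq(...))."""
    def __init__(self, func, children=None, embedding=None):
        self.func = func                # function name, e.g. "count"
        self.children = children or []  # sub-programs (arguments)
        self.embedding = embedding      # leaf argument embedding, if any


class ModuleNetwork(nn.Module):
    """Recursively applies one small module per logical function."""
    def __init__(self, hidden_size, functions):
        super().__init__()
        # One module per function; a real system would use richer,
        # function-tailored architectures and graph-enhanced token inputs.
        self.modules_by_func = nn.ModuleDict(
            {f: nn.Linear(hidden_size, hidden_size) for f in functions}
        )

    def forward(self, node):
        if not node.children:           # leaf: return its embedding
            return node.embedding
        # Encode the arguments first, pool them, then apply the module
        # associated with this node's function.
        child_reprs = torch.stack([self.forward(c) for c in node.children])
        pooled = child_reprs.mean(dim=0)
        return torch.relu(self.modules_by_func[node.func](pooled))


# Toy usage: a program count(filter_eq(arg)) over a random leaf embedding.
hidden = 16
leaf = ProgramNode("arg", embedding=torch.randn(hidden))
program = ProgramNode("count", [ProgramNode("filter_eq", [leaf])])
net = ModuleNetwork(hidden, functions=["count", "filter_eq"])
print(net(program).shape)  # torch.Size([16])
```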