Recursive Tree-Structured Self-Attention for Answer Sentence Selection

Khalil Mrini, Emilia Farcas, Ndapa Nakashole


Abstract
Syntactic structure is an important component of natural language text. Recent top-performing models in Answer Sentence Selection (AS2) use self-attention and transfer learning, but not syntactic structure. Tree structures have shown strong performance in sentence-pair tasks such as semantic relatedness. We investigate whether tree structures can boost performance in AS2. We introduce the Tree Aggregation Transformer: a novel recursive, tree-structured self-attention model for AS2. Thanks to its recursive nature, our model can represent all levels of syntactic parse trees with only one additional self-attention layer. Without transfer learning, we establish a new state of the art on the popular TrecQA and WikiQA benchmark datasets. Additionally, we evaluate our method on four Community Question Answering datasets, and find that tree-structured representations have limitations with noisy user-generated text. We conduct probing experiments to evaluate how our models leverage tree structures across datasets. Our findings show that the ability of tree-structured models to successfully absorb syntactic information is strongly correlated with higher performance in AS2.
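
To make the idea in the abstract concrete, here is a minimal sketch (in PyTorch, not the authors' released code) of how a single shared self-attention layer can be applied recursively, bottom-up, over a syntactic parse tree, so that every level of the tree is encoded without stacking additional layers. The TreeNode and TreeAggregator names, the mean pooling over children, and all hyperparameters are illustrative assumptions rather than the paper's exact architecture.

import torch
import torch.nn as nn


class TreeNode:
    """A parse-tree node: leaves carry token embeddings, internal nodes have children."""
    def __init__(self, children=None, embedding=None):
        self.children = children or []
        self.embedding = embedding  # torch.Tensor of shape (d_model,) for leaves


class TreeAggregator(nn.Module):
    """Recursively encodes a parse tree with one shared self-attention layer."""
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        # The same attention parameters are reused at every tree level (the recursion).
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, node: TreeNode) -> torch.Tensor:
        if not node.children:                # leaf: return its token embedding
            return node.embedding
        # Recursively encode each child, then self-attend over the child vectors.
        child_vecs = torch.stack([self.forward(c) for c in node.children])   # (n_children, d)
        x = child_vecs.unsqueeze(0)                                          # (1, n_children, d)
        out, _ = self.attn(x, x, x)
        return out.mean(dim=1).squeeze(0)    # pool children into the parent representation


if __name__ == "__main__":
    d = 64
    leaves = [TreeNode(embedding=torch.randn(d)) for _ in range(3)]
    tree = TreeNode(children=[TreeNode(children=leaves[:2]), leaves[2]])
    sentence_vec = TreeAggregator(d_model=d)(tree)
    print(sentence_vec.shape)  # torch.Size([64])

In this sketch, the root vector could serve as the sentence representation scored against the question; how the paper combines question and candidate answer representations is not shown here.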
Anthology ID:
2021.acl-long.358
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
4651–4661
URL:
https://aclanthology.org/2021.acl-long.358
DOI:
10.18653/v1/2021.acl-long.358
Cite (ACL):
Khalil Mrini, Emilia Farcas, and Ndapa Nakashole. 2021. Recursive Tree-Structured Self-Attention for Answer Sentence Selection. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 4651–4661, Online. Association for Computational Linguistics.
Cite (Informal):
Recursive Tree-Structured Self-Attention for Answer Sentence Selection (Mrini et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.358.pdf
Video:
https://aclanthology.org/2021.acl-long.358.mp4
Data
ASNQ | WikiQA