Why and when should you pool? Analyzing Pooling in Recurrent Architectures

Pratyush Maini, Keshav Kolluru, Danish Pruthi, Mausam


Abstract
Pooling-based recurrent neural architectures consistently outperform their counterparts without pooling on sequence classification tasks. However, the reasons for their enhanced performance are largely unexamined. In this work, we examine three commonly used pooling techniques (mean-pooling, max-pooling, and attention), and propose *max-attention*, a novel variant that captures interactions among predictive tokens in a sentence. Using novel experiments, we demonstrate that pooling architectures substantially differ from their non-pooling equivalents in their learning ability and positional biases: (i) pooling facilitates better gradient flow than BiLSTMs in initial training epochs, and (ii) BiLSTMs are biased towards tokens at the beginning and end of the input, whereas pooling alleviates this bias. Consequently, we find that pooling yields large gains in low-resource scenarios and in instances where salient words lie towards the middle of the input. Across several text classification tasks, we find max-attention to frequently outperform other pooling techniques.
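To make the pooling variants concrete, below is a minimal PyTorch sketch (not the authors' released code) of mean-, max-, and attention-pooling over BiLSTM hidden states; all class and variable names and the toy dimensions are illustrative assumptions, and the paper's novel max-attention variant is not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PooledBiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=128,
                 num_classes=2, pooling="max"):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, bidirectional=True,
                              batch_first=True)
        self.attn_scorer = nn.Linear(2 * hidden_dim, 1)  # used only for attention pooling
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)
        self.pooling = pooling

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -> hidden states h: (batch, seq_len, 2*hidden_dim)
        h, _ = self.bilstm(self.embedding(token_ids))
        if self.pooling == "mean":
            # average the hidden states over the time dimension
            pooled = h.mean(dim=1)
        elif self.pooling == "max":
            # element-wise maximum over the time dimension
            pooled, _ = h.max(dim=1)
        elif self.pooling == "attention":
            # learned scalar score per position, softmax-normalized into weights
            scores = self.attn_scorer(h).squeeze(-1)          # (batch, seq_len)
            weights = F.softmax(scores, dim=1).unsqueeze(-1)  # (batch, seq_len, 1)
            pooled = (weights * h).sum(dim=1)
        else:
            # no pooling: use the final hidden state, as a plain BiLSTM classifier would
            pooled = h[:, -1, :]
        return self.classifier(pooled)

# Usage: classify a batch of two 20-token sequences with max-pooling.
model = PooledBiLSTMClassifier(pooling="max")
logits = model(torch.randint(0, 10000, (2, 20)))
print(logits.shape)  # torch.Size([2, 2])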
Anthology ID:
2020.findings-emnlp.410
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4568–4586
URL:
https://aclanthology.org/2020.findings-emnlp.410
DOI:
10.18653/v1/2020.findings-emnlp.410
Cite (ACL):
Pratyush Maini, Keshav Kolluru, Danish Pruthi, and Mausam. 2020. Why and when should you pool? Analyzing Pooling in Recurrent Architectures. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4568–4586, Online. Association for Computational Linguistics.
Cite (Informal):
Why and when should you pool? Analyzing Pooling in Recurrent Architectures (Maini et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.410.pdf
Code
dair-iitd/PoolingAnalysis
Data
IMDb Movie Reviews
Yahoo! Answers