LittleBird: Efficient Faster & Longer Transformer for Question Answering

Minchul Lee, Kijong Han, Myeong Cheol Shin


Abstract
BERT has shown great success in a wide variety of NLP tasks, but its attention mechanism scales quadratically with input length, limiting its ability to handle long inputs. Longformer, ETC, and BigBird address this issue and effectively solve the quadratic dependency problem. However, we find that these models are not sufficient, and propose LittleBird, a novel model based on BigBird with improved speed and memory footprint while maintaining accuracy. In particular, we devise a more flexible and efficient position representation method based on Attention with Linear Biases (ALiBi). We also show that replacing BigBird's method of representing global information with pack and unpack attention is more effective. The proposed model can work on long inputs even after being pre-trained on short inputs, and can be trained efficiently by reusing an existing pre-trained language model for short inputs. This is a significant benefit for low-resource languages, where large amounts of long text data are difficult to obtain. As a result, our experiments show that LittleBird works very well in a variety of languages, achieving high performance on question answering tasks, particularly on KorQuAD 2.0, a Korean question answering dataset for long paragraphs.
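For intuition, below is a minimal sketch of the ALiBi idea the abstract builds on: each attention score receives an additive bias that decreases linearly with the query-key distance, using a fixed per-head slope, so no learned position embeddings are needed and the model extrapolates to longer inputs. This sketch shows plain bidirectional ALiBi only; the symmetric-distance variant, function names, and tensor shapes are illustrative assumptions, and LittleBird's own position method extends ALiBi (making the biases more flexible) rather than using it verbatim.

```python
import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    """Additive ALiBi-style attention bias: -slope_h * |i - j| per head.

    Slopes follow the geometric schedule from the ALiBi paper,
    2^(-8h/num_heads) for head h = 1..num_heads.
    """
    slopes = torch.tensor(
        [2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)]
    )
    pos = torch.arange(seq_len)
    # distance[i, j] = |i - j|: attention is penalized linearly by distance.
    distance = (pos[None, :] - pos[:, None]).abs()
    return -slopes[:, None, None] * distance  # shape: (heads, seq, seq)

def attention_with_alibi(q, k, v):
    """Scaled dot-product attention with the linear position bias added.

    q, k, v: tensors of shape (heads, seq, head_dim).
    """
    heads, seq_len, dim = q.shape
    scores = q @ k.transpose(-2, -1) / dim ** 0.5
    scores = scores + alibi_bias(heads, seq_len)
    return torch.softmax(scores, dim=-1) @ v

# Usage: 8 heads, 128 tokens, 64-dim heads -> output (8, 128, 64).
q = k = v = torch.randn(8, 128, 64)
out = attention_with_alibi(q, k, v)
```

Because the bias is a pure function of token distance, the same module can be evaluated on sequences longer than those seen in pre-training, which is the property the paper exploits to train on short inputs and run on long ones.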
Anthology ID:
2022.emnlp-main.352
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5261–5277
URL:
https://aclanthology.org/2022.emnlp-main.352
DOI:
10.18653/v1/2022.emnlp-main.352
Cite (ACL):
Minchul Lee, Kijong Han, and Myeong Cheol Shin. 2022. LittleBird: Efficient Faster & Longer Transformer for Question Answering. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5261–5277, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
LittleBird: Efficient Faster & Longer Transformer for Question Answering (Lee et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.352.pdf