Dispatcher: A Message-Passing Approach to Language Modelling

Alberto Cetoli


Abstract
This paper proposes a message-passing mechanism for language modelling. A new layer type is introduced that aims to substitute for self-attention in unidirectional sequence generation tasks. The system is shown to be competitive with existing methods: given N tokens, the computational complexity is O(N log N) and the memory complexity is O(N) under reasonable assumptions. The Dispatcher layer is found to achieve perplexity comparable to self-attention while being more efficient.
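
The abstract describes a causal message-passing layer with O(N log N) compute. Below is a minimal sketch, under stated assumptions, of what such a layer could look like in PyTorch: the class name DispatcherSketch, the shared linear mixing function, and the power-of-two message offsets are illustrative choices, not the authors' implementation (see the fractalego/dispatcher repository for the actual code).

# Minimal sketch of a log-step, causal message-passing layer in the spirit
# of the abstract. Illustrative only; not the authors' implementation.
import torch
import torch.nn as nn


class DispatcherSketch(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        # A single shared mixing function is assumed here for simplicity;
        # a separate mix per stage would also keep the cost O(N log N).
        self.mix = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N, d_model). At stage k, every position i receives a
        # message from position i - 2**k (when it exists), so the layer stays
        # strictly causal. About log2(N) stages give the O(N log N) compute
        # scaling described in the abstract.
        n = x.size(1)
        h = x
        step = 1
        while step < n:
            shifted = torch.zeros_like(h)
            shifted[:, step:, :] = h[:, :-step, :]
            h = h + self.mix(torch.cat([h, shifted], dim=-1))
            step *= 2
        return h


# Usage sketch: such a layer would replace the self-attention block in a
# decoder-only language model.
layer = DispatcherSketch(d_model=64)
tokens = torch.randn(2, 16, 64)   # (batch, sequence length, embedding dim)
out = layer(tokens)               # same shape: (2, 16, 64)
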
Anthology ID: 2022.clasp-1.3
Volume: Proceedings of the 2022 CLASP Conference on (Dis)embodiment
Month: September
Year: 2022
Address: Gothenburg, Sweden
Editors: Simon Dobnik, Julian Grove, Asad Sayeed
Venue: CLASP
Publisher: Association for Computational Linguistics
Pages: 24–29
URL: https://aclanthology.org/2022.clasp-1.3
Cite (ACL):
Alberto Cetoli. 2022. Dispatcher: A Message-Passing Approach to Language Modelling. In Proceedings of the 2022 CLASP Conference on (Dis)embodiment, pages 24–29, Gothenburg, Sweden. Association for Computational Linguistics.
Cite (Informal):
Dispatcher: A Message-Passing Approach to Language Modelling (Cetoli, CLASP 2022)
PDF: https://aclanthology.org/2022.clasp-1.3.pdf
Code: fractalego/dispatcher
Data: WebText, WikiText-103, WikiText-2