Contextual BERT: Conditioning the Language Model Using a Global State

Timo I. Denk, Ana Peleteiro Ramallo


Abstract
BERT is a popular language model whose main pre-training task is to fill in the blank, i.e., predicting a word that was masked out of a sentence, based on the remaining words. In some applications, however, having additional context can help the model make the right prediction, e.g., by taking the domain or the time of writing into account. This motivates us to advance the BERT architecture by adding a global state for conditioning on a fixed-size context. We present our two novel approaches and apply them to an industry use case, where we complete fashion outfits with missing articles, conditioned on a specific customer. An experimental comparison to other methods from the literature shows that our methods improve personalization significantly.
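The abstract does not spell out the paper's two conditioning mechanisms, so the snippet below is only a minimal sketch of the general idea under one plausible assumption: the fixed-size context vector is projected into an extra key/value pair that every token can attend to, alongside the usual token-to-token attention. All names (attention_with_global_state, W_gk, W_gv, etc.) are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

def attention_with_global_state(x, g, W_q, W_k, W_v, W_gk, W_gv):
    """Single-head scaled dot-product self-attention over token states x
    (seq_len x d_model), extended with one global-state vector g (d_ctx,).

    The global state is projected into one additional key/value pair, so every
    token can attend to the context the same way it attends to other tokens.
    This is a hypothetical sketch, not the paper's exact formulation.
    """
    d_k = W_k.shape[1]
    q = x @ W_q                                                # (seq_len, d_k)
    k = torch.cat([x @ W_k, (g @ W_gk).unsqueeze(0)], dim=0)   # (seq_len + 1, d_k)
    v = torch.cat([x @ W_v, (g @ W_gv).unsqueeze(0)], dim=0)   # (seq_len + 1, d_v)
    scores = (q @ k.T) / d_k ** 0.5                            # tokens see tokens and g
    return F.softmax(scores, dim=-1) @ v                       # (seq_len, d_v)

# Toy usage: 4 tokens with hidden size 8, a context vector of size 3
# (e.g., an embedding of the customer for whom the outfit is completed).
x = torch.randn(4, 8)
g = torch.randn(3)
W_q, W_k, W_v = (torch.randn(8, 8) for _ in range(3))
W_gk, W_gv = torch.randn(3, 8), torch.randn(3, 8)
out = attention_with_global_state(x, g, W_q, W_k, W_v, W_gk, W_gv)
print(out.shape)  # torch.Size([4, 8])
```

Because the output shape matches plain self-attention, a layer of this kind could in principle drop into a BERT block; whether the paper injects the context this way or elsewhere (e.g., at the embedding level) is not stated in the abstract.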
Anthology ID: 2020.textgraphs-1.5
Volume: Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs)
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Dmitry Ustalov, Swapna Somasundaran, Alexander Panchenko, Fragkiskos D. Malliaros, Ioana Hulpuș, Peter Jansen, Abhik Jana
Venue: TextGraphs
Publisher: Association for Computational Linguistics
Pages: 46–50
URL: https://aclanthology.org/2020.textgraphs-1.5
DOI: 10.18653/v1/2020.textgraphs-1.5
Cite (ACL): Timo I. Denk and Ana Peleteiro Ramallo. 2020. Contextual BERT: Conditioning the Language Model Using a Global State. In Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs), pages 46–50, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal): Contextual BERT: Conditioning the Language Model Using a Global State (Denk & Peleteiro Ramallo, TextGraphs 2020)
PDF: https://aclanthology.org/2020.textgraphs-1.5.pdf