Effect of Post-processing on Contextualized Word Representations

Hassan Sajjad, Firoj Alam, Fahim Dalvi, Nadir Durrani


Abstract
Post-processing of static embeddings has been shown to improve their performance on both lexical and sequence-level tasks. However, post-processing for contextualized embeddings is an under-studied problem. In this work, we question the usefulness of post-processing for contextualized embeddings obtained from different layers of pre-trained language models. More specifically, we standardize individual neuron activations using z-score and min-max normalization, and we remove top principal components using the all-but-the-top method. Additionally, we apply unit-length normalization to word representations. On a diverse set of pre-trained models, we show that post-processing unwraps vital information present in the representations for both lexical tasks (such as word similarity and analogy) and sequence classification tasks. Our findings raise interesting points in relation to the research studies that use contextualized representations, and suggest z-score normalization as an essential step to consider when using them in an application.
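The four post-processing methods named in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' released code; the function names, the `eps` stabilizer, and the choice of two principal components in the all-but-the-top sketch are assumptions for illustration. Each function operates on a matrix of word representations with one row per word (or word occurrence) and one column per neuron.

```python
import numpy as np

def zscore_normalize(embeddings, eps=1e-8):
    # Standardize each neuron (column) to zero mean and unit variance
    # across all word representations. eps guards against dead neurons.
    mean = embeddings.mean(axis=0)
    std = embeddings.std(axis=0)
    return (embeddings - mean) / (std + eps)

def minmax_normalize(embeddings, eps=1e-8):
    # Rescale each neuron (column) to the [0, 1] range.
    lo = embeddings.min(axis=0)
    hi = embeddings.max(axis=0)
    return (embeddings - lo) / (hi - lo + eps)

def all_but_the_top(embeddings, n_components=2):
    # Center the representations, then project out the top principal
    # directions (n_components=2 here is an illustrative choice).
    centered = embeddings - embeddings.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    top = vt[:n_components]                  # (n_components, dim)
    return centered - centered @ top.T @ top

def unit_length(embeddings, eps=1e-8):
    # L2-normalize each word representation (row) to unit length.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / (norms + eps)
```

In practice the `embeddings` matrix would hold activations extracted from one layer of a pre-trained model; the normalization statistics are computed per neuron over the whole collection, not per sentence.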
Anthology ID:
2022.coling-1.277
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3127–3142
URL:
https://aclanthology.org/2022.coling-1.277
Cite (ACL):
Hassan Sajjad, Firoj Alam, Fahim Dalvi, and Nadir Durrani. 2022. Effect of Post-processing on Contextualized Word Representations. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3127–3142, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Effect of Post-processing on Contextualized Word Representations (Sajjad et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.277.pdf
Data
GLUE, QNLI, SST