Marzieh Tahaei, Ella Charlaix, Vahid Nia, Ali Ghodsi, and Mehdi Rezagholizadeh. 2022. KroneckerBERT: Significant Compression of Pre-trained Language Models Through Kronecker Decomposition and Knowledge Distillation. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, edited by Marine Carpuat, Marie-Catherine de Marneffe, and Ivan Vladimir Meza Ruiz, pages 2116–2127, Seattle, United States. Association for Computational Linguistics. DOI: 10.18653/v1/2022.naacl-main.154. https://aclanthology.org/2022.naacl-main.154/