PreQuant: A Task-agnostic Quantization Approach for Pre-trained Language Models
Authors: Zhuocheng Gong, Jiahao Liu, Qifan Wang, Yang Yang, Jingang Wang, Wei Wu, Yunsen Xian, Dongyan Zhao, Rui Yan
Venue: Findings of the Association for Computational Linguistics: ACL 2023
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Publisher: Association for Computational Linguistics, Toronto, Canada
Type: conference publication
Date: July 2023
Pages: 8065-8079
Anthology ID: gong-etal-2023-prequant
DOI: 10.18653/v1/2023.findings-acl.511
URL: https://aclanthology.org/2023.findings-acl.511/