@inproceedings{son-etal-2024-prefixing,
    title = "Prefixing Attention Sinks can Mitigate Activation Outliers for Large Language Model Quantization",
    author = "Son, Seungwoo and
      Park, Wonpyo and
      Han, Woohyun and
      Kim, Kyuyeun and
      Lee, Jaeho",
    editor = "Al-Onaizan, Yaser and
      Bansal, Mohit and
      Chen, Yun-Nung",
    booktitle = "Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2024",
    address = "Miami, Florida, USA",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.emnlp-main.134/",
    doi = "10.18653/v1/2024.emnlp-main.134",
    pages = "2242--2252",
}