Shiyue Zhang, Shijie Wu, Ozan Irsoy, Steven Lu, Mohit Bansal, Mark Dredze, and David Rosenberg. 2023. MixCE: Training Autoregressive Language Models by Mixing Forward and Reverse Cross-Entropies. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), edited by Anna Rogers, Jordan Boyd-Graber, and Naoaki Okazaki, pages 9027–9050, Toronto, Canada, July 2023. Association for Computational Linguistics. Anthology ID: zhang-etal-2023-mixce. DOI: 10.18653/v1/2023.acl-long.502. URL: https://aclanthology.org/2023.acl-long.502/