%0 Conference Proceedings
%T Cheat Codes to Quantify Missing Source Information in Neural Machine Translation
%A Pal, Proyag
%A Heafield, Kenneth
%Y Carpuat, Marine
%Y de Marneffe, Marie-Catherine
%Y Meza Ruiz, Ivan Vladimir
%S Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
%D 2022
%8 July
%I Association for Computational Linguistics
%C Seattle, United States
%F pal-heafield-2022-cheat
%X This paper describes a method to quantify the amount of information H(t|s) added by the target sentence t that is not present in the source s in a neural machine translation system. We do this by providing the model the target sentence in a highly compressed form (a “cheat code”), and exploring the effect of the size of the cheat code. We find that the model is able to capture extra information from just a single float representation of the target and nearly reproduces the target with two 32-bit floats per target token.
%R 10.18653/v1/2022.naacl-main.177
%U https://aclanthology.org/2022.naacl-main.177
%U https://doi.org/10.18653/v1/2022.naacl-main.177
%P 2472-2477