%0 Conference Proceedings
%T Generic Mechanism for Reducing Repetitions in Encoder-Decoder Models
%A Zhang, Ying
%A Kamigaito, Hidetaka
%A Aoki, Tatsuya
%A Takamura, Hiroya
%A Okumura, Manabu
%Y Mitkov, Ruslan
%Y Angelova, Galia
%S Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
%D 2021
%8 September
%I INCOMA Ltd.
%C Held Online
%F zhang-etal-2021-generic
%X Encoder-decoder models have been commonly used for many tasks such as machine translation and response generation. As reported in previous research, these models suffer from generating redundant repetitions. In this research, we propose a new mechanism for encoder-decoder models that estimates the semantic difference of a source sentence before and after it is fed into the encoder-decoder model, capturing the consistency between the two sides. This mechanism helps reduce repeatedly generated tokens across a variety of tasks. Evaluation results on publicly available machine translation and response generation datasets demonstrate the effectiveness of our proposal.
%U https://aclanthology.org/2021.ranlp-1.180
%P 1606-1615