@inproceedings{thompson-etal-2019-overcoming,
    title = "Overcoming Catastrophic Forgetting During Domain Adaptation of Neural Machine Translation",
    author = "Thompson, Brian  and
      Gwinnup, Jeremy  and
      Khayrallah, Huda  and
      Duh, Kevin  and
      Koehn, Philipp",
    editor = "Burstein, Jill  and
      Doran, Christy  and
      Solorio, Thamar",
    booktitle = "Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)",
    month = jun,
    year = "2019",
    address = "Minneapolis, Minnesota",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/N19-1209",
    doi = "10.18653/v1/N19-1209",
    pages = "2062--2068",
    abstract = "Continued training is an effective method for domain adaptation in neural machine translation. However, in-domain gains from adaptation come at the expense of general-domain performance. In this work, we interpret the drop in general-domain performance as catastrophic forgetting of general-domain knowledge. To mitigate it, we adapt Elastic Weight Consolidation (EWC){---}a machine learning method for learning a new task without forgetting previous tasks. Our method retains the majority of general-domain performance lost in continued training without degrading in-domain performance, outperforming the previous state-of-the-art. We also explore the full range of general-domain performance available when some in-domain degradation is acceptable.",
}
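
For context, the abstract's core technique (Elastic Weight Consolidation) regularizes continued training by adding a quadratic penalty that anchors each parameter to its general-domain value, weighted by a per-parameter importance estimate such as the diagonal Fisher information. Below is a minimal PyTorch-style sketch of that idea; the names fisher, general_params, and lam are illustrative assumptions, and this is an illustration of EWC in general, not the authors' released code.

    import torch

    def ewc_penalty(model, fisher, general_params):
        # EWC quadratic penalty: sum_i F_i * (theta_i - theta_general_i)^2,
        # where theta_general are the pre-adaptation (general-domain) weights
        # and F is a per-parameter importance estimate (e.g. diagonal Fisher).
        penalty = torch.zeros((), device=next(model.parameters()).device)
        for name, param in model.named_parameters():
            penalty = penalty + (fisher[name] * (param - general_params[name]) ** 2).sum()
        return penalty

    # During continued training on in-domain data, lam trades in-domain
    # gains against forgetting of general-domain performance:
    #   loss = in_domain_cross_entropy + lam * ewc_penalty(model, fisher, general_params)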
