Elastic Weight Removal for Faithful and Abstractive Dialogue Generation

Nico Daheim, Nouha Dziri, Mrinmaya Sachan, Iryna Gurevych, Edoardo Ponti


Abstract
Generating factual responses is a crucial requirement for dialogue systems. To promote more factual responses, a common strategy is to ground their responses in relevant documents that inform response generation. However, common dialogue models still often hallucinate information that was not contained in these documents and is therefore unfaithful. In this work, we propose to alleviate such hallucinations by ‘subtracting’ the parameters of a model trained to hallucinate from a dialogue response generation model in order to ‘negate’ the contribution of such hallucinated examples from it. Extensive automatic and human evaluation shows favourable results when compared to state-of-the-art methods that combine the distributions of multiple models, such as DExperts (Liu et al., 2021), and others that change the training procedure, such as Quark (Lu et al., 2022a). Finally, we show how we can not only reduce hallucinations but also discourage extractive responses, which are often a consequence of reducing hallucinations by encouraging copy-pasting of document spans. We publicly release our code for reproducibility and to facilitate further research.
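The core idea in the abstract is weight-space negation: subtract the parameter shift induced by training on hallucinated examples from a document-grounded dialogue model. Below is a minimal, hypothetical sketch of that subtraction in PyTorch; it is not the paper's exact Elastic Weight Removal formulation (which additionally weights the removal per parameter), and all names (theta_dialogue, theta_anti, theta_pre, alpha) are illustrative assumptions.

```python
import torch

def remove_hallucination_direction(theta_dialogue, theta_anti, theta_pre, alpha=0.3):
    """Sketch of weight-space negation.

    theta_dialogue: state_dict of the grounded dialogue response model
    theta_anti:     state_dict of the same architecture fine-tuned on hallucinated examples
    theta_pre:      state_dict of the shared pre-trained initialisation
    alpha:          scaling factor for how strongly the hallucination direction is removed
    """
    merged = {}
    for name, w_dialogue in theta_dialogue.items():
        # Direction in weight space attributed to training on hallucinated data.
        task_vector = theta_anti[name] - theta_pre[name]
        # 'Negate' that contribution from the dialogue model's weights.
        merged[name] = w_dialogue - alpha * task_vector
    return merged

# Usage (three models sharing one architecture):
# new_state = remove_hallucination_direction(dlg.state_dict(), anti.state_dict(), pre.state_dict())
# dlg.load_state_dict(new_state)
```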
Anthology ID:
2024.naacl-long.393
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
7089–7105
URL:
https://aclanthology.org/2024.naacl-long.393
Cite (ACL):
Nico Daheim, Nouha Dziri, Mrinmaya Sachan, Iryna Gurevych, and Edoardo Ponti. 2024. Elastic Weight Removal for Faithful and Abstractive Dialogue Generation. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 7089–7105, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Elastic Weight Removal for Faithful and Abstractive Dialogue Generation (Daheim et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.393.pdf
Copyright:
2024.naacl-long.393.copyright.pdf