Smoothing Dialogue States for Open Conversational Machine Reading

Zhuosheng Zhang, Siru Ouyang, Hai Zhao, Masao Utiyama, Eiichiro Sumita


Abstract
Conversational machine reading (CMR) requires machines to communicate with humans through multi-turn interactions that alternate between two salient dialogue states: decision making and question generation. In the open CMR setting, the more realistic scenario, the retrieved background knowledge can be noisy, posing severe challenges for information transmission between the two states. Existing studies commonly train independent or pipelined systems for the two subtasks; however, these methods rely on hard-label decisions to activate question generation, which ultimately hinders model performance. In this work, we propose an effective gating strategy that smooths the two dialogue states within a single decoder, bridging decision making and question generation to provide a richer dialogue state reference. Experiments on the OR-ShARC dataset show the effectiveness of our method, which achieves new state-of-the-art results.
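As a rough illustration of the soft gating idea the abstract describes, the sketch below blends a decision-making representation and a question-generation representation with a learned sigmoid gate before a single shared decoder. All module and variable names are hypothetical; this is a minimal PyTorch sketch of the general technique, not the authors' OSCAR implementation (see the linked code repository for that).

import torch
import torch.nn as nn

class DialogueStateGate(nn.Module):
    # Minimal sketch: a soft gate that interpolates between a
    # decision-making state and a question-generation state, instead of
    # using a hard decision label to switch question generation on or off.
    def __init__(self, hidden_size: int):
        super().__init__()
        # Gate values are computed from the concatenation of both states.
        self.gate_proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, decision_state: torch.Tensor,
                question_state: torch.Tensor) -> torch.Tensor:
        # Both inputs: (batch, seq_len, hidden_size)
        gate = torch.sigmoid(self.gate_proj(
            torch.cat([decision_state, question_state], dim=-1)))
        # Smooth interpolation of the two dialogue states.
        return gate * decision_state + (1.0 - gate) * question_state

# Usage: fuse the two states and feed the result to one shared decoder.
if __name__ == "__main__":
    hidden = 768
    fuse = DialogueStateGate(hidden)
    d_state = torch.randn(2, 16, hidden)   # decision-making representation
    q_state = torch.randn(2, 16, hidden)   # question-generation representation
    fused = fuse(d_state, q_state)         # (2, 16, 768), input to the decoder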
Anthology ID:
2021.emnlp-main.299
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3685–3696
URL:
https://aclanthology.org/2021.emnlp-main.299
DOI:
10.18653/v1/2021.emnlp-main.299
Cite (ACL):
Zhuosheng Zhang, Siru Ouyang, Hai Zhao, Masao Utiyama, and Eiichiro Sumita. 2021. Smoothing Dialogue States for Open Conversational Machine Reading. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3685–3696, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Smoothing Dialogue States for Open Conversational Machine Reading (Zhang et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.299.pdf
Video:
https://aclanthology.org/2021.emnlp-main.299.mp4
Code:
ozyyshr/oscar
Data:
ShARC