Infusing Context and Knowledge Awareness in Multi-turn Dialog Understanding
Ting-Wei Wu | Biing-Hwang Juang
Findings of the Association for Computational Linguistics: EACL 2023
In multi-turn dialog understanding, semantic frames are constructed by detecting intents and slots within each user utterance. However, recent works lack the capability of modeling multi-turn dynamics within a dialog in natural language understanding (NLU), instead leaving them for updating dialog states only. Moreover, humans usually associate relevant background knowledge with the current dialog context to better illustrate slot semantics revealed from word connotations; previous works have explored this possibility mostly in knowledge-grounded response generation. In this paper, we propose to bridge this research gap by equipping a BERT-based NLU framework with knowledge and context awareness. We first encode dialog contexts with a unidirectional context-aware transformer encoder and then select relevant inter-word knowledge, conditioned on the current word and the preceding history, via a knowledge attention mechanism. Experimental results on two complex multi-turn dialog datasets demonstrate significant improvements from our proposed framework. Attention visualization also illustrates how our modules leverage knowledge across the utterance.
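The knowledge attention mechanism described in the abstract can be sketched as scaled dot-product attention of a word (or context) representation over a set of candidate knowledge embeddings. This is an illustrative sketch only, not the paper's actual implementation; the function name `knowledge_attention` and the use of plain NumPy vectors in place of BERT hidden states are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_attention(query, knowledge):
    """Attend a query vector (e.g., a contextualized word representation)
    over rows of `knowledge` (candidate knowledge embeddings).

    query:     shape (d,)
    knowledge: shape (num_facts, d)
    Returns the attended knowledge vector (d,) and the attention
    weights (num_facts,) that sum to 1.
    """
    d = query.shape[-1]
    scores = knowledge @ query / np.sqrt(d)   # similarity of each fact to the query
    weights = softmax(scores)                 # distribution over candidate facts
    attended = weights @ knowledge            # weighted sum of knowledge embeddings
    return attended, weights
```

In the full model, the attended knowledge vector would be fused with the word's contextual representation before intent and slot prediction; here the weights alone show which facts the mechanism selects for a given word.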
A Survey on Automatic Speech Recognition with an Illustrative Example on Continuous Speech Recognition of Mandarin
Chin-Hui Lee | Biing-Hwang Juang
International Journal of Computational Linguistics & Chinese Language Processing, Volume 1, Number 1, August 1996