Improving Backchannel Prediction Leveraging Sequential and Attentive Context Awareness

Yo-Han Park, Wencke Liermann, Yong-Seok Choi, Kong Joo Lee


Abstract
Backchannels, which refer to short and often affirmative or empathetic responses from a listener during a conversation, play a crucial role in effective communication. In this paper, we introduce CABP (Context-Aware Backchannel Prediction), a sequential and attentive context approach aimed at enhancing backchannel prediction performance. Additionally, CABP leverages the pretrained wav2vec model for encoding audio signals. Experimental results show that CABP performs better than context-free models, with performance improvements of 1.3% and 1.8% on the Korean and English datasets, respectively. Furthermore, when utilizing the pretrained wav2vec model, CABP consistently demonstrates the best performance, achieving performance improvements of 4.4% and 3.1% on the Korean and English datasets.
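The attentive-context idea in the abstract can be illustrated with a minimal sketch: attend over embeddings of preceding utterances to build a context vector, then combine it with the current audio representation before a binary backchannel decision. This is a hypothetical NumPy illustration with random toy embeddings and a toy linear head, not the authors' CABP architecture or the actual wav2vec encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_context(current, history):
    """Weight preceding utterance embeddings by their dot-product
    relevance to the current representation (illustrative sketch)."""
    weights = softmax(history @ current)
    return weights @ history  # attention-weighted context vector

d = 8
history = rng.normal(size=(5, d))  # embeddings of 5 preceding utterances (toy data)
current = rng.normal(size=d)       # embedding of the current audio segment (toy data)

ctx = attentive_context(current, history)
features = np.concatenate([current, ctx])  # context-aware representation

# hypothetical linear head producing a backchannel probability
w = rng.normal(size=2 * d)
prob = 1.0 / (1.0 + np.exp(-(features @ w)))
print(f"backchannel probability: {prob:.3f}")
```

In the paper's setting, the toy embeddings would instead come from a pretrained audio encoder such as wav2vec, and the head would be trained on labeled backchannel data.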
Anthology ID:
2024.findings-eacl.118
Volume:
Findings of the Association for Computational Linguistics: EACL 2024
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1689–1694
URL:
https://aclanthology.org/2024.findings-eacl.118
Cite (ACL):
Yo-Han Park, Wencke Liermann, Yong-Seok Choi, and Kong Joo Lee. 2024. Improving Backchannel Prediction Leveraging Sequential and Attentive Context Awareness. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1689–1694, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Improving Backchannel Prediction Leveraging Sequential and Attentive Context Awareness (Park et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-eacl.118.pdf
Software:
https://aclanthology.org/2024.findings-eacl.118.software.zip