Robust Learning for Multi-party Addressee Recognition with Discrete Addressee Codebook

Pengcheng Zhu, Wei Zhou, Kuncai Zhang, Yuankai Ma, Haiqing Chen


Abstract
Addressee recognition aims to identify addressees in multi-party conversations. While state-of-the-art addressee recognition models have achieved promising performance, they still suffer from a lack of robustness when applied in real-world scenarios. When exposed to a noisy environment, these models take the noise as input and identify an addressee from a pre-given closed set, even though the true addressees of noisy inputs do not belong to this closed set, which leads to incorrect addressee identification. To this end, we propose a Robust Addressee Recognition (RAR) method, which discretizes addressees into a character codebook, enabling it to represent open-set addressees and remain robust in noisy environments. Experimental results show that introducing the addressee character codebook helps represent open-set addressees and substantially improves the robustness of addressee recognition, even when the input is noise.
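Since the abstract frames the method as discretizing addressee representations into a codebook, one natural reading is a vector-quantization-style nearest-code lookup. Below is a minimal, hypothetical PyTorch sketch of that idea; the class name AddresseeCodebook, the code count, the dimensions, and the straight-through gradient trick are all assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class AddresseeCodebook(nn.Module):
    """Hypothetical sketch: quantize a continuous addressee encoding
    to its nearest entry in a learned discrete codebook."""

    def __init__(self, num_codes: int = 128, dim: int = 256):
        super().__init__()
        # Learnable codebook of discrete addressee "characters" (assumed size)
        self.codes = nn.Embedding(num_codes, dim)
        nn.init.uniform_(self.codes.weight, -1.0 / num_codes, 1.0 / num_codes)

    def forward(self, addressee_repr: torch.Tensor):
        # addressee_repr: (batch, dim) continuous encoding from some encoder
        # Pairwise L2 distance from each encoding to every code vector
        dists = torch.cdist(addressee_repr, self.codes.weight)  # (batch, num_codes)
        indices = dists.argmin(dim=-1)                          # nearest code index
        quantized = self.codes(indices)                         # (batch, dim)
        # Straight-through estimator: copy gradients past the argmax
        quantized = addressee_repr + (quantized - addressee_repr).detach()
        return quantized, indices, dists

# Usage sketch with stand-in encoder outputs
codebook = AddresseeCodebook(num_codes=128, dim=256)
h = torch.randn(4, 256)  # pretend these are 4 utterance encodings
q, idx, dists = codebook(h)
print(q.shape, idx.tolist())
```

Under this reading, open-set behavior would follow naturally: an input whose distance to its nearest code exceeds some threshold could be rejected as noise rather than forced onto a closed-set addressee, which is one plausible mechanism behind the robustness the abstract reports.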
Anthology ID:
2023.acl-short.50
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
571–578
URL:
https://aclanthology.org/2023.acl-short.50
DOI:
10.18653/v1/2023.acl-short.50
Bibkey:
Cite (ACL):
Pengcheng Zhu, Wei Zhou, Kuncai Zhang, Yuankai Ma, and Haiqing Chen. 2023. Robust Learning for Multi-party Addressee Recognition with Discrete Addressee Codebook. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 571–578, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Robust Learning for Multi-party Addressee Recognition with Discrete Addressee Codebook (Zhu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.50.pdf
Video:
https://aclanthology.org/2023.acl-short.50.mp4