Exploring the Naturalness of Cognitive Status-Informed Referring Form Selection Models

Gabriel Del Castillo, Grace Clark, Zhao Han, Tom Williams


Abstract
Language-capable robots must be able to communicate efficiently and naturally about objects in their environment. A key part of communication is Referring Form Selection (RFS): the process of selecting a referring form such as it, that, or the N to use when referring to an object. Recent cognitive status-informed computational RFS models have been evaluated in terms of goodness-of-fit to human data, but it remains unclear whether these models actually select referring forms that are any more natural than baseline alternatives, regardless of goodness-of-fit. Through a human-subjects study designed to assess this question, we show that even though cognitive status-informed referring form selection models achieve good fit to human data, they do not (yet) produce concrete benefits in terms of naturalness. On the other hand, our results show that human utterances also exhibited high variability in perceived naturalness, demonstrating the challenges of evaluating RFS naturalness.
Anthology ID: 2023.inlg-main.19
Volume: Proceedings of the 16th International Natural Language Generation Conference
Month: September
Year: 2023
Address: Prague, Czechia
Editors: C. Maria Keet, Hung-Yi Lee, Sina Zarrieß
Venues: INLG | SIGDIAL
SIG: SIGGEN
Publisher: Association for Computational Linguistics
Pages: 269–278
URL: https://aclanthology.org/2023.inlg-main.19
DOI: 10.18653/v1/2023.inlg-main.19
Cite (ACL): Gabriel Del Castillo, Grace Clark, Zhao Han, and Tom Williams. 2023. Exploring the Naturalness of Cognitive Status-Informed Referring Form Selection Models. In Proceedings of the 16th International Natural Language Generation Conference, pages 269–278, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal): Exploring the Naturalness of Cognitive Status-Informed Referring Form Selection Models (Del Castillo et al., INLG-SIGDIAL 2023)
PDF: https://aclanthology.org/2023.inlg-main.19.pdf