Towards Understanding the Relation between Gestures and Language

Artem Abzaliev, Andrew Owens, Rada Mihalcea


Abstract
In this paper, we explore the relation between gestures and language. Using a multimodal dataset consisting of TED talks in which the language is aligned with the gestures made by the speakers, we adapt a semi-supervised multimodal model to learn gesture embeddings. We show that gestures are predictive of the native language of the speaker, and that gesture embeddings further improve language prediction results. In addition, gesture embeddings might contain some linguistic information, as we show by probing the embeddings for psycholinguistic categories. Finally, we analyze the words that lead to the most expressive gestures and find that function words drive the expressiveness of gestures.
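The probing step described in the abstract can be illustrated with a minimal sketch: a linear probe trained on frozen gesture embeddings to test whether they encode a given psycholinguistic category. This is not the authors' code; the embedding dimension, the binary category, and the data below are synthetic placeholders, and the logistic-regression probe is an assumed (standard) choice.

```python
# Minimal probing sketch (assumptions, not the paper's implementation):
# train a linear classifier on fixed gesture embeddings and compare its
# accuracy to a trivial baseline to check for encoded information.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder inputs: one hypothetical 256-d embedding per gesture clip,
# with a binary label for some psycholinguistic category.
X = rng.normal(size=(1000, 256))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A linear probe: accuracy clearly above the majority-class baseline
# suggests the embeddings carry information about the category.
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("probe accuracy:", accuracy_score(y_test, probe.predict(X_test)))
```

With real data, X would be the learned gesture embeddings and y the category labels; the probe is kept deliberately simple so that any predictive power is attributable to the embeddings rather than the classifier.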
Anthology ID: 2022.coling-1.488
Volume: Proceedings of the 29th International Conference on Computational Linguistics
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Editors: Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 5507–5520
URL: https://aclanthology.org/2022.coling-1.488
Cite (ACL): Artem Abzaliev, Andrew Owens, and Rada Mihalcea. 2022. Towards Understanding the Relation between Gestures and Language. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5507–5520, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal): Towards Understanding the Relation between Gestures and Language (Abzaliev et al., COLING 2022)
PDF: https://aclanthology.org/2022.coling-1.488.pdf