Multimodal large language models for inclusive collaboration learning tasks

Armanda Lewis


Abstract
This PhD project leverages advances in multimodal large language models to build an inclusive collaboration feedback loop that supports automated detection, modeling, and feedback for participants developing general collaboration skills. The topic is important given the role of collaboration as an essential 21st-century skill, the potential to ground large language models within learning theory and real-world practice, and the expressive potential of transformer models to support equity and inclusion. We address some concerns about integrating advances in natural language processing into downstream tasks such as the learning analytics feedback loop.
Anthology ID:
2022.naacl-srw.26
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop
Month:
July
Year:
2022
Address:
Hybrid: Seattle, Washington + Online
Editors:
Daphne Ippolito, Liunian Harold Li, Maria Leonor Pacheco, Danqi Chen, Nianwen Xue
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
202–210
URL:
https://aclanthology.org/2022.naacl-srw.26
DOI:
10.18653/v1/2022.naacl-srw.26
Cite (ACL):
Armanda Lewis. 2022. Multimodal large language models for inclusive collaboration learning tasks. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop, pages 202–210, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
Cite (Informal):
Multimodal large language models for inclusive collaboration learning tasks (Lewis, NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-srw.26.pdf
Video:
https://aclanthology.org/2022.naacl-srw.26.mp4