A modular architecture for creating multimodal embodied agents with an episodic Knowledge Graph as an explainable and controllable long-term memory

Thomas Baier, Selene Báez Santamaría, Piek Vossen


Abstract
How can flexibility and control over the interpretation of multimodal signals by embodied agents be balanced? Flexibility means that agents respond fluently in any context, whereas control means that responses are transparent and faithful to explicitly defined goals and principles. This paper describes a modular platform for creating multimodal interactive agents: signals and interpretations are posted as a time-ordered sequence on an event bus, while control options drive the interaction according to specific intentions and goals. Different sensors and interpretation components can be integrated by defining their input and output topics on the event bus, resulting in an open, multimodal, sequence-driven workflow for further interpretation. In addition, our platform allows us to define higher-level intents that control sequence patterns to achieve a goal. A key component is an episodic Knowledge Graph (eKG) that acts as a long-term symbolic memory, aggregating and connecting these interpretations and establishing coherence and continuity across different interactions. Intents and the eKG make it possible to define different (embodied) agents and compare their behavior without having to implement complex software components for multimodal sensor data or design the control over their dependencies. In this paper, we explain the broad range of components that we developed and integrated into various interactive agents. We also explain how the interaction is recorded as multimodal data and how it results in an aggregated memory in the eKG. By analyzing the recorded interaction, we can compare agents and agent components and study their interactive behavior with people and other agents.
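The topic-based event-bus workflow described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the `EventBus` class, the topic names, and the `transcriber` component are all hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory event bus: components register for input topics
    and post their interpretations to output topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of handlers
        self.log = []  # ordered record of all events (the sequence in time)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        self.log.append((topic, payload))
        for handler in self._subscribers[topic]:
            handler(self, payload)

# A hypothetical interpretation component: consumes the "audio" topic
# and posts a derived interpretation to the "text" topic.
def transcriber(bus, audio_event):
    bus.publish("text", {"transcript": f"heard: {audio_event['signal']}"})

bus = EventBus()
bus.subscribe("audio", transcriber)
bus.publish("audio", {"signal": "hello"})
# bus.log now holds the audio event followed by the derived text event
```

Chaining further components (e.g. one that consumes "text" and writes to a memory topic) would extend the sequence in the same way, which is what makes the workflow open-ended.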
Anthology ID:
2025.dnd-16.11
Volume:
Dialogue & Discourse, Volume 16
Month:
December
Year:
2025
Address:
Chicago, Illinois, USA
Editors:
Amir Zeldes, Manfred Stede, Patrick G.T. Healey, and Hendrik Buschmeier
Venue:
DND
SIG:
SIGDIAL
Publisher:
University of Illinois Chicago
Pages:
25–59
URL:
https://aclanthology.org/2025.dnd-16.11/
DOI:
10.5210/dad.2025.303
Cite (ACL):
Thomas Baier, Selene Báez Santamaría, and Piek Vossen. 2025. A modular architecture for creating multimodal embodied agents with an episodic Knowledge Graph as an explainable and controllable long-term memory. Dialogue & Discourse, 16:25–59.
Cite (Informal):
A modular architecture for creating multimodal embodied agents with an episodic Knowledge Graph as an explainable and controllable long-term memory (Baier et al., DND 2025)
PDF:
https://aclanthology.org/2025.dnd-16.11.pdf