Penetrative AI: Making LLMs Comprehend the Physical World

Huatao Xu, Liying Han, Qirui Yang, Mo Li, Mani Srivastava


Abstract
Recent developments in Large Language Models (LLMs) have demonstrated their remarkable capabilities across a range of tasks. Questions, however, persist about the nature of LLMs and their potential to integrate common-sense human knowledge when performing tasks involving information about the real physical world. This paper delves into these questions by exploring how LLMs can be extended to interact with and reason about the physical world through IoT sensors and actuators, a concept that we term “Penetrative AI”. The paper explores such an extension at two levels of LLMs’ ability to penetrate into the physical world via the processing of sensory signals. Our preliminary findings indicate that LLMs, with ChatGPT being the representative example in our exploration, have considerable and unique proficiency in employing their embedded world knowledge to interpret IoT sensor data and reason about tasks in the physical realm. Not only does this open up new applications for LLMs beyond traditional text-based tasks, but it also enables new ways of incorporating human knowledge in cyber-physical systems.
Anthology ID:
2024.findings-acl.437
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7324–7341
URL:
https://aclanthology.org/2024.findings-acl.437
Cite (ACL):
Huatao Xu, Liying Han, Qirui Yang, Mo Li, and Mani Srivastava. 2024. Penetrative AI: Making LLMs Comprehend the Physical World. In Findings of the Association for Computational Linguistics: ACL 2024, pages 7324–7341, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Penetrative AI: Making LLMs Comprehend the Physical World (Xu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.437.pdf