How Do Large Language Models Perform on PDE Discovery: A Coarse-to-fine Perspective

Xiao Luo, Changhu Wang, Yizhou Sun, Wei Wang


Abstract
This paper studies the problem of using large language models (LLMs) to identify the underlying partial differential equations (PDEs) from very limited observations of a physical system. Previous methods usually utilize physics-informed neural networks (PINNs) to learn the PDE solver and the coefficients of the PDEs simultaneously, an approach that can suffer from performance degradation under extreme data scarcity. To this end, this paper leverages LLMs to solve this problem without further fine-tuning by proposing a novel framework named LLM for PDE Discovery (LLM4PD). The core of LLM4PD is a coarse-to-fine paradigm for automatically discovering the underlying PDEs. In the coarse phase, LLM4PD selects crucial terms from a candidate library using hierarchical prompts and incorporates a review agent to enhance accuracy. In the fine phase, LLM4PD interacts with a PDE solver to optimize the coefficients of the selected terms based on the optimization trajectory. We also provide an adaptive hybrid optimization strategy that switches between fine-tuning and exploration to balance stability and efficiency. Extensive experiments on several physical systems validate the effectiveness of the proposed LLM4PD in different settings.
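To make the coarse-to-fine idea concrete, the sketch below illustrates the kind of two-stage pipeline the abstract describes: a library of candidate PDE terms is built from the observed field, a coarse step picks a subset of terms (here a stub standing in for the LLM's hierarchical prompting and review agent), and a fine step fits their coefficients against the data. All function names and the selection logic are illustrative assumptions, not the authors' actual implementation or API.

```python
# Hypothetical sketch of a coarse-to-fine PDE discovery pipeline
# (assumed structure; the real LLM4PD coarse phase is LLM-driven).
import numpy as np

def build_library(u, dx, dt):
    """Evaluate candidate terms on a space-time grid u[t, x]."""
    u_t = np.gradient(u, dt, axis=0)          # left-hand side u_t
    u_x = np.gradient(u, dx, axis=1)
    u_xx = np.gradient(u_x, dx, axis=1)
    terms = {"u": u, "u_x": u_x, "u_xx": u_xx, "u*u_x": u * u_x}
    return u_t, terms

def coarse_select(terms):
    """Coarse phase placeholder: choose plausible terms.
    In LLM4PD this is done by hierarchical prompts plus a review agent."""
    return ["u_xx", "u*u_x"]                  # hypothetical choice

def fine_fit(u_t, terms, selected):
    """Fine phase: least-squares fit of coefficients for the chosen terms."""
    A = np.stack([terms[k].ravel() for k in selected], axis=1)
    coef, *_ = np.linalg.lstsq(A, u_t.ravel(), rcond=None)
    return dict(zip(selected, coef))

if __name__ == "__main__":
    # Toy data: a diffusing Gaussian, so u_t is dominated by nu * u_xx.
    x = np.linspace(-1.0, 1.0, 128)
    t = np.linspace(0.01, 0.2, 64)
    dx, dt, nu = x[1] - x[0], t[1] - t[0], 0.1
    X, T = np.meshgrid(x, t)
    u = np.exp(-X**2 / (4 * nu * T)) / np.sqrt(4 * np.pi * nu * T)
    u_t, terms = build_library(u, dx, dt)
    print(fine_fit(u_t, terms, coarse_select(terms)))  # u_xx coefficient near 0.1
```

In the paper itself, the fine phase additionally feeds the optimization trajectory back to the LLM and alternates between fine-tuning and exploration; the least-squares fit above merely stands in for that coefficient-optimization step.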
Anthology ID:
2025.findings-emnlp.145
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2684–2697
URL:
https://aclanthology.org/2025.findings-emnlp.145/
Cite (ACL):
Xiao Luo, Changhu Wang, Yizhou Sun, and Wei Wang. 2025. How Do Large Language Models Perform on PDE Discovery: A Coarse-to-fine Perspective. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 2684–2697, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
How Do Large Language Models Perform on PDE Discovery: A Coarse-to-fine Perspective (Luo et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.145.pdf
Checklist:
2025.findings-emnlp.145.checklist.pdf