Yingtong Dou


2025

Enhancing Foundation Models in Transaction Understanding with LLM-based Sentence Embeddings
Xiran Fan | Zhimeng Jiang | Chin-Chia Michael Yeh | Yuzhong Chen | Yingtong Dou | Menghai Pan | Yan Zheng
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track

The ubiquity of payment networks generates vast transactional data encoding rich consumer and merchant behavioral patterns. Recent foundation models for transaction analysis process tabular data sequentially but rely on index-based representations for categorical merchant fields, causing substantial semantic information loss by converting rich textual data into discrete tokens. While Large Language Models (LLMs) can address this limitation through superior semantic understanding, their computational overhead challenges real-time financial deployment. We introduce a hybrid framework that uses LLM-generated embeddings as semantic initializations for lightweight transaction models, balancing interpretability with operational efficiency. Our approach employs multi-source data fusion to enrich merchant categorical fields and a one-word constraint principle for consistent embedding generation across LLM architectures. We systematically address data quality through noise filtering and context-aware enrichment. Experiments on large-scale transaction datasets demonstrate significant performance improvements across multiple transaction understanding tasks.
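The paper's implementation is not shown here; as a rough illustration of the "semantic initialization" idea described in the abstract, the sketch below seeds each categorical merchant field's embedding row from a sentence-level vector rather than a random index-based vector. `llm_sentence_embedding` is a hypothetical stand-in (a deterministic hash-seeded vector) for a real LLM embedding call, and `EMBED_DIM` is a toy size; both are assumptions, not the authors' code.

```python
import hashlib
import random

EMBED_DIM = 4  # toy dimension; real sentence embeddings are far larger (e.g. 768+)

def llm_sentence_embedding(text: str) -> list[float]:
    # Hypothetical stand-in for an LLM sentence-embedding call.
    # Deterministic pseudo-vector derived from a hash, for illustration only.
    seed = int(hashlib.sha256(text.encode("utf-8")).hexdigest(), 16) % (2**32)
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(EMBED_DIM)]

def build_init_table(merchant_categories: list[str]) -> dict[str, list[float]]:
    # Instead of randomly initialized index-based categorical embeddings,
    # seed each merchant-category row with a semantic vector. The abstract's
    # "one-word constraint" suggests one canonical label per field; here each
    # (possibly enriched) label string is embedded directly.
    return {cat: llm_sentence_embedding(cat) for cat in merchant_categories}

table = build_init_table(["grocery store", "airline", "coffee shop"])
```

In a real pipeline these rows would initialize the embedding layer of the lightweight transaction model, which is then fine-tuned as usual; only the initialization changes.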