SLANG: New Concept Comprehension of Large Language Models

Lingrui Mei, Shenghua Liu, Yiwei Wang, Baolong Bi, Xueqi Cheng


Abstract
The dynamic nature of language, particularly evident in the realm of slang and memes on the Internet, poses serious challenges to the adaptability of Large Language Models (LLMs). Traditionally anchored to static datasets, these models often struggle to keep pace with the rapid linguistic evolution characteristic of online communities. This research aims to bridge this gap by enhancing LLMs’ comprehension of evolving new concepts on the Internet, without the high cost of continual retraining. In pursuit of this goal, we introduce SLANG, a benchmark designed to autonomously integrate novel data and assess LLMs’ ability to comprehend emerging concepts, alongside FOCUS, an approach that uses causal inference to enhance LLMs’ understanding of new phrases and their colloquial context. Our benchmark and approach involve understanding real-world instances of linguistic shifts, which serve as contextual beacons, to form more precise and contextually relevant connections between newly emerging expressions and their meanings. Empirical analysis shows that our causal inference-based approach outperforms baseline methods in both precision and relevance in the comprehension of Internet slang and memes.
Anthology ID:
2024.emnlp-main.698
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
12558–12575
URL:
https://aclanthology.org/2024.emnlp-main.698/
DOI:
10.18653/v1/2024.emnlp-main.698
Cite (ACL):
Lingrui Mei, Shenghua Liu, Yiwei Wang, Baolong Bi, and Xueqi Cheng. 2024. SLANG: New Concept Comprehension of Large Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 12558–12575, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
SLANG: New Concept Comprehension of Large Language Models (Mei et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.698.pdf