Yajing Luo
2025
AntIF: Evaluating the Anti-Interference Capability of Large Language Models
Yajing Luo | Yutao Hou | Yun Chen | Guanhua Chen
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
This paper proposes a multi-agent collaborative framework for generating interference data, aimed at evaluating and analyzing the robustness of large language models under complex interference. Starting from the mathematics domain and gradually extending to medicine, law, science, and general scenarios, the framework constructs AntIF, a cross-domain dataset of nearly 5,000 instances covering four interference types: spelling interference, numerical interference, type interference, and rumor interference. On this basis, the paper systematically evaluates the anti-interference capability of mainstream open-source language models and, in combination with different prompt-engineering strategies and model fine-tuning methods, analyzes in depth the practical effect of AntIF in improving model robustness.
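The abstract names four interference types but does not specify how they are generated. The toy functions below are purely illustrative (they are not the AntIF pipeline, whose perturbations are produced by a multi-agent framework); they only sketch what a spelling perturbation and a numerical distractor might look like when applied to a math question.

```python
# Hypothetical toy perturbations; the AntIF dataset itself is built
# with a multi-agent generation framework, not these rules.

def spelling_interference(text: str) -> str:
    """Swap two adjacent letters in the first word longer than 3 characters."""
    words = text.split()
    for i, w in enumerate(words):
        if len(w) > 3:
            words[i] = w[:1] + w[2] + w[1] + w[3:]
            break
    return " ".join(words)

def numerical_interference(text: str) -> str:
    """Append an irrelevant number that a robust model should ignore."""
    return text + " (Note: the store also sold 47 apples yesterday.)"

question = "Tom has 12 oranges and gives away 5. How many are left?"
print(spelling_interference(question))
# -> "Tom has 12 oarnges and gives away 5. How many are left?"
print(numerical_interference(question))
```

A robust model should return the same answer (7) for the original and both perturbed questions; the benchmark measures how often that invariance holds.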
2023
StyleBART: Decorate Pretrained Model with Style Adapters for Unsupervised Stylistic Headline Generation
Hanqing Wang | Yajing Luo | Boya Xiong | Guanhua Chen | Yun Chen
Findings of the Association for Computational Linguistics: EMNLP 2023
Stylistic headline generation is the task of generating a headline that not only summarizes the content of an article but also reflects a desired style that attracts users. As style-specific article-headline pairs are scarce, previous research has focused on unsupervised approaches using a standard headline generation dataset and mono-style corpora. In this work, we follow this line and propose StyleBART, an unsupervised approach for stylistic headline generation. Our method decorates the pretrained BART model with adapters that are responsible for different styles, allowing headlines with diverse styles to be generated by simply switching the adapters. Unlike previous works, StyleBART separates the tasks of style learning and headline generation, making it possible to freely combine the base model and the style adapters during inference. We further propose an inverse paraphrasing task to enhance the style adapters. Extensive automatic and human evaluations show that StyleBART achieves new state-of-the-art performance on the unsupervised stylistic headline generation task, producing high-quality headlines with the desired style.
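The central design choice in the abstract is the decoupling of style learning from headline generation: a fixed base generator plus swappable per-style adapters. The sketch below illustrates only that interface, with trivial stand-in logic; the class and method names are hypothetical and do not come from the paper's code, which builds on pretrained BART.

```python
# Illustrative sketch of the adapter-switching interface, not StyleBART itself.

class BaseGenerator:
    """Stand-in for the pretrained summarizer (kept fixed at inference)."""
    def summarize(self, article: str) -> str:
        # Trivial stand-in: take the first sentence as the headline.
        return article.split(".")[0].strip()

class StyleAdapter:
    """Stand-in for a style adapter trained on a mono-style corpus."""
    def __init__(self, style: str):
        self.style = style
    def restyle(self, headline: str) -> str:
        # Trivial stand-in for the learned style transformation.
        if self.style == "question":
            return headline.rstrip(".") + "?"
        return headline + "!"

class StyleBARTSketch:
    def __init__(self, base: BaseGenerator):
        self.base = base
        self.adapter = None
    def set_adapter(self, adapter: StyleAdapter) -> None:
        # Swapping the adapter changes the output style; the base never changes.
        self.adapter = adapter
    def generate(self, article: str) -> str:
        headline = self.base.summarize(article)
        return self.adapter.restyle(headline) if self.adapter else headline

model = StyleBARTSketch(BaseGenerator())
model.set_adapter(StyleAdapter("question"))
print(model.generate("The city opened a new park. It has trails."))
# -> "The city opened a new park?"
```

Because the base and adapters only meet at this narrow interface, any style adapter can be combined with the base model at inference time without retraining either component, which is the property the abstract highlights.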