Yaqi Wang

2024

Reusing Transferable Weight Increments for Low-resource Style Generation
Chunzhen Jin | Eliot Huang | Heng Chang | Yaqi Wang | Peng Cao | Osmar Zaiane
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing

Text style transfer (TST) is a crucial task in natural language processing that aims to endow text with a new style without altering its meaning. In real-world scenarios, not all styles have abundant resources. This work introduces TWIST (reusing Transferable Weight Increments for Style Text generation), a novel framework that mitigates data scarcity by exploiting the style features captured in weight increments to transfer low-resource styles effectively. During target style learning, we derive knowledge from a specially designed weight pool and use it to initialize the parameters for the unseen style. To enhance the effectiveness of merging, the target style weight increments are merged from multiple source style weight increments through their singular vectors. Considering the diversity of styles, we also design a multi-key memory network that attends to both task- and instance-level information to retrieve the most relevant weight increments. Results on multiple style transfer datasets show that TWIST achieves strong performance across different backbones and is particularly effective in low-resource scenarios.
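The abstract mentions merging source style weight increments through singular vectors, but does not spell out the procedure. A minimal sketch of one plausible reading, SVD-truncating a relevance-weighted sum of source increments to initialize the target style, might look like the following. The function name, the weighting scheme, and the rank choice are illustrative assumptions, not the authors' method.

```python
import numpy as np

def merge_weight_increments(increments, weights, rank):
    """Merge several source-style weight increments into one
    low-rank increment used to initialize an unseen target style.

    increments : list of (d_out, d_in) arrays, one per source style
    weights    : relevance weight per source (e.g., from a memory lookup)
    rank       : target rank of the merged increment
    """
    # Relevance-weighted combination of the source increments.
    combined = sum(w * dW for w, dW in zip(weights, increments))

    # Keep only the leading singular directions, which concentrate
    # the style information shared across the sources.
    U, S, Vt = np.linalg.svd(combined, full_matrices=False)
    return (U[:, :rank] * S[:rank]) @ Vt[:rank, :]

# Toy usage: three 8x8 source increments merged into a rank-2 target init.
rng = np.random.default_rng(0)
sources = [rng.normal(size=(8, 8)) for _ in range(3)]
relevance = np.array([0.5, 0.3, 0.2])  # hypothetical memory-network scores
target_init = merge_weight_increments(sources, relevance, rank=2)
print(target_init.shape)  # (8, 8) matrix of rank 2
```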

2015

NDMSCS: A Topic-Based Chinese Microblog Polarity Classification System
Yang Wang | Yaqi Wang | Shi Feng | Daling Wang | Yifei Zhang
Proceedings of the Eighth SIGHAN Workshop on Chinese Language Processing

NEUDM: A System for Topic-Based Message Polarity Classification
Yaqi Wang | Shi Feng | Daling Wang | Yifei Zhang
Proceedings of the Eighth SIGHAN Workshop on Chinese Language Processing