All You Need is Attention: Lightweight Attention-based Data Augmentation for Text Classification
Junehyung Kim | Sungjae Hwang
Findings of the Association for Computational Linguistics: EMNLP 2024
This paper introduces LADAM, a novel method for enhancing performance on text classification tasks. LADAM employs attention mechanisms to exchange semantically similar words between sentences. This approach generates a greater diversity of synthetic sentences compared to simpler operations like random insertion, while maintaining the context of the original sentences. Additionally, LADAM is an easy-to-use, lightweight technique that does not require external datasets or large language models. Our experimental results on five datasets demonstrate that LADAM consistently outperforms baseline methods across diverse text classification conditions.
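The abstract describes the core idea only at a high level. As a rough illustration, and not the authors' implementation, the sketch below swaps the single most semantically similar token pair between two sentences, using contextual embeddings from bert-base-uncased and cosine similarity as assumed stand-ins for the paper's attention-based matching.

```python
# Illustrative sketch only: exchange the most semantically similar token pair
# between two sentences. Model choice (bert-base-uncased) and cosine-similarity
# matching are assumptions for illustration, not details taken from the paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed_tokens(sentence):
    """Return (tokens, contextual embeddings) for a sentence, special tokens removed."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]          # (seq_len, dim)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    # Drop [CLS]/[SEP] so only real word pieces are candidates for exchange.
    return tokens[1:-1], hidden[1:-1]

def exchange_most_similar(sent_a, sent_b):
    """Swap the most similar token pair between two sentences (crude detokenization)."""
    tok_a, emb_a = embed_tokens(sent_a)
    tok_b, emb_b = embed_tokens(sent_b)
    # Cosine similarity between every token in A and every token in B.
    sim = torch.nn.functional.normalize(emb_a, dim=-1) @ \
          torch.nn.functional.normalize(emb_b, dim=-1).T    # (len_a, len_b)
    i, j = divmod(sim.argmax().item(), sim.shape[1])
    tok_a[i], tok_b[j] = tok_b[j], tok_a[i]
    return " ".join(tok_a), " ".join(tok_b)

print(exchange_most_similar("the movie was wonderful", "a truly great film"))
```

In this sketch the exchanged pair is chosen greedily from a single similarity matrix; the actual method may differ in how candidates are scored and how many words are exchanged.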