Unsupervised Data Augmentation for Aspect Based Sentiment Analysis

David Z. Chen, Adam Faulkner, Sahil Badyal


Abstract
Recent approaches to Aspect-based Sentiment Analysis (ABSA) take a co-extraction approach to this span-level classification task, performing the subtasks of aspect term extraction (ATE) and aspect sentiment classification (ASC) simultaneously. In this work, we build on recent progress in applying pre-training to this co-extraction task with the introduction of an adaptation of Unsupervised Data Augmentation (UDA) in semi-supervised learning. As originally implemented, UDA cannot accommodate span-level classification since it relies on advanced data augmentation techniques, such as back-translation, that alter the sequence lengths of the original data and cause index mismatches. We introduce an adaptation of UDA using Masked Language Model (MLM) unmasking that accommodates this index-match constraint and test the approach on standard ABSA benchmark datasets. We show that simple augmentations applied to modest-sized datasets, combined with consistency training, yield performance competitive with the current ABSA state of the art in the restaurant and laptop domains using only 75% of the training data.
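The key property of the MLM-unmasking augmentation described above is that each masked token is replaced by exactly one predicted token, so sequence length is unchanged and span-level aspect labels remain index-aligned. The sketch below illustrates this constraint with a toy one-for-one substitution table standing in for a real pretrained MLM (the table, function name, and probabilities are illustrative assumptions, not the paper's implementation):

```python
import random

# Toy stand-in for a pretrained MLM's fill-mask step; a real system
# would mask tokens and let an MLM predict the replacements. The
# property demonstrated is one token in, one token out, so aspect-span
# indices survive augmentation (the "index-match constraint").
SUBSTITUTES = {
    "great": ["excellent", "fantastic"],
    "slow": ["sluggish", "laggy"],
    "food": ["cuisine", "dishes"],
}

def mlm_style_augment(tokens, mask_prob=0.3, rng=None):
    """Replace some tokens one-for-one, preserving sequence length."""
    rng = rng or random.Random(0)
    out = []
    for tok in tokens:
        cands = SUBSTITUTES.get(tok)
        if cands and rng.random() < mask_prob:
            out.append(rng.choice(cands))  # exactly one token emitted
        else:
            out.append(tok)
    return out

tokens = ["the", "food", "was", "great", "but", "service", "slow"]
aspect_span = (1, 2)  # "food": span labels stay valid after augmentation
aug = mlm_style_augment(tokens, mask_prob=1.0)
assert len(aug) == len(tokens)  # index-match constraint holds
```

Because lengths match, the same BIO or span annotations can be copied verbatim onto the augmented sequence, which back-translation cannot guarantee.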
Anthology ID:
2022.coling-1.586
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6746–6751
URL:
https://aclanthology.org/2022.coling-1.586
Cite (ACL):
David Z. Chen, Adam Faulkner, and Sahil Badyal. 2022. Unsupervised Data Augmentation for Aspect Based Sentiment Analysis. In Proceedings of the 29th International Conference on Computational Linguistics, pages 6746–6751, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Unsupervised Data Augmentation for Aspect Based Sentiment Analysis (Chen et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.586.pdf