Domain Adaptation for Sentiment Analysis Using Robust Internal Representations

Mohammad Rostami, Digbalay Bose, Shrikanth Narayanan, Aram Galstyan


Abstract
Sentiment analysis is a costly yet necessary task that enterprises perform to study the opinions of their customers, improve their products, and determine optimal marketing strategies. Because products and services span a wide range of domains, cross-domain sentiment analysis methods have received significant attention. These methods mitigate the domain gap between different applications by training cross-domain generalizable classifiers that relax the need for data annotation in each domain. We develop a domain adaptation method that induces large margins between data representations belonging to different classes in an embedding space. This embedding space is trained to be domain-agnostic by matching the data distributions across the domains. Large interclass margins in the source domain help to reduce the effect of “domain shift” in the target domain. Theoretical and empirical analyses are provided to demonstrate that the proposed method is effective.
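Below is a minimal, illustrative sketch (in PyTorch) of the kind of objective the abstract describes: a shared encoder maps source and target inputs into one embedding space, a margin-based classification loss encourages large interclass margins on labeled source data, and a distribution-matching term aligns source and target embeddings. The specific choices here (a multi-class hinge loss, a linear-kernel mean-matching term, the network sizes, and the 0.1 weight) are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps input features (e.g., pooled sentence embeddings) into a shared latent space."""
    def __init__(self, in_dim=768, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))

    def forward(self, x):
        return self.net(x)

def multiclass_hinge_loss(logits, labels, margin=1.0):
    # Push the true-class score above every other class score by at least `margin`,
    # encouraging large interclass margins in the embedding space.
    correct = logits.gather(1, labels.unsqueeze(1))            # (B, 1)
    violations = F.relu(logits - correct + margin)             # (B, C)
    mask = F.one_hot(labels, num_classes=logits.size(1)).bool()
    return violations.masked_fill(mask, 0.0).sum(dim=1).mean()

def mean_matching_loss(z_src, z_tgt):
    # Linear-kernel MMD: match the first moments of source and target embeddings.
    # The paper may use a different discrepancy measure; this is only a stand-in.
    return (z_src.mean(dim=0) - z_tgt.mean(dim=0)).pow(2).sum()

encoder = Encoder()
classifier = nn.Linear(128, 2)                                 # binary sentiment head
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)

# One hypothetical training step with random stand-in data.
x_src, y_src = torch.randn(32, 768), torch.randint(0, 2, (32,))  # labeled source batch
x_tgt = torch.randn(32, 768)                                     # unlabeled target batch

z_src, z_tgt = encoder(x_src), encoder(x_tgt)
loss = multiclass_hinge_loss(classifier(z_src), y_src) + 0.1 * mean_matching_loss(z_src, z_tgt)
opt.zero_grad()
loss.backward()
opt.step()

The key point of the sketch is that the margin term and the alignment term are optimized jointly over the same embedding; the mean-matching term could be swapped for another distribution-matching loss (e.g., an optimal-transport-based distance) without changing the overall structure.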
Anthology ID:
2023.findings-emnlp.769
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11484–11498
URL:
https://aclanthology.org/2023.findings-emnlp.769
DOI:
10.18653/v1/2023.findings-emnlp.769
Cite (ACL):
Mohammad Rostami, Digbalay Bose, Shrikanth Narayanan, and Aram Galstyan. 2023. Domain Adaptation for Sentiment Analysis Using Robust Internal Representations. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11484–11498, Singapore. Association for Computational Linguistics.
Cite (Informal):
Domain Adaptation for Sentiment Analysis Using Robust Internal Representations (Rostami et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.769.pdf