DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis

Hu Xu, Bing Liu, Lei Shu, Philip Yu


Abstract
This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, which helps in learning domain-specific language models under low-resource conditions. Experiments on an assortment of aspect-based sentiment analysis (ABSA) tasks demonstrate promising results.
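To make the recipe in the abstract concrete, below is a minimal sketch (not the authors' code) of continued masked-language-model pretraining on a mixture of in-domain text and text drawn from related domains. The corpus file names, relevance weights, and mixing procedure are illustrative assumptions; DomBERT's actual contribution includes learning which domains are relevant, which the fixed weights here merely stand in for.

```python
# Hedged sketch: continue BERT's MLM pretraining on a weighted mixture of
# in-domain and relevant-domain text. Paths and weights are hypothetical.
import random
import torch
from transformers import (BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling)

tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
collator = DataCollatorForLanguageModeling(tokenizer=tok, mlm_probability=0.15)

# Hypothetical corpora: the target domain plus candidate related domains.
# DomBERT learns domain relevance; fixed weights keep this sketch simple.
corpora = {
    "laptop_reviews.txt": 1.0,   # target (in-domain) corpus
    "tablet_reviews.txt": 0.6,   # assumed relevant domain
    "camera_reviews.txt": 0.4,   # assumed relevant domain
}

def sample_batch(batch_size=8):
    """Draw sentences from corpora in proportion to their relevance weight."""
    names, weights = zip(*corpora.items())
    batch = []
    for _ in range(batch_size):
        path = random.choices(names, weights=weights, k=1)[0]
        with open(path) as f:
            batch.append(random.choice(f.readlines()).strip())
    return batch

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for step in range(1000):  # illustrative step count
    enc = tok(sample_batch(), truncation=True, padding=True,
              max_length=128, return_tensors="pt")
    # The collator applies random masking and builds MLM labels.
    features = [{k: v[i] for k, v in enc.items()}
                for i in range(enc["input_ids"].shape[0])]
    inputs = collator(features)
    loss = model(**inputs).loss  # masked-LM loss on the mixed batch
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```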
Anthology ID: 2020.findings-emnlp.156
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1725–1731
URL: https://aclanthology.org/2020.findings-emnlp.156
DOI: 10.18653/v1/2020.findings-emnlp.156
Cite (ACL):
Hu Xu, Bing Liu, Lei Shu, and Philip Yu. 2020. DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1725–1731, Online. Association for Computational Linguistics.
Cite (Informal):
DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis (Xu et al., Findings 2020)
PDF: https://aclanthology.org/2020.findings-emnlp.156.pdf