Understanding Pre-trained BERT for Aspect-based Sentiment Analysis

Hu Xu, Lei Shu, Philip Yu, Bing Liu


Abstract
This paper analyzes the pre-trained hidden representations learned from reviews on BERT for tasks in aspect-based sentiment analysis (ABSA). Our work is motivated by recent progress in BERT-based language models for ABSA. However, it is not clear how the general proxy task of a (masked) language model, trained on an unlabeled corpus without annotations of aspects or opinions, can provide important features for downstream ABSA tasks. By leveraging annotated ABSA datasets, we investigate both the attentions and the learned representations of BERT pre-trained on reviews. We find that BERT uses very few self-attention heads to encode the context words (such as prepositions or pronouns that indicate an aspect) and the opinion words of an aspect. Most features in the representation of an aspect are dedicated to the fine-grained semantics of the domain (or product category) and the aspect itself, rather than carrying summarized opinions from its context. We hope this investigation can help future research in improving self-supervised learning, unsupervised learning, and fine-tuning for ABSA. The pre-trained model and code can be found at https://github.com/howardhsu/BERT-for-RRC-ABSA.
Anthology ID: 2020.coling-main.21
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 244–250
URL: https://aclanthology.org/2020.coling-main.21
DOI: 10.18653/v1/2020.coling-main.21
Cite (ACL): Hu Xu, Lei Shu, Philip Yu, and Bing Liu. 2020. Understanding Pre-trained BERT for Aspect-based Sentiment Analysis. In Proceedings of the 28th International Conference on Computational Linguistics, pages 244–250, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Understanding Pre-trained BERT for Aspect-based Sentiment Analysis (Xu et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.21.pdf
Code: howardhsu/BERT-for-RRC-ABSA (+ additional community code)