A Robustly Optimized BMRC for Aspect Sentiment Triplet Extraction

Shu Liu, Kaiwen Li, Zuhe Li


Abstract
Aspect sentiment triplet extraction (ASTE) is a challenging subtask of aspect-based sentiment analysis. It aims to extract triplets of aspects, opinions, and sentiments with complex correspondences from the context. The bidirectional machine reading comprehension (BMRC) framework can effectively handle the ASTE task, but several problems remain, such as query conflict and unilateral probability decrease. Therefore, this paper presents a robustly optimized BMRC method that incorporates four improvements. Word segmentation is applied to facilitate semantic learning. Exclusive classifiers are designed to avoid interference between different queries. A span matching rule is proposed to select the aspects and opinions that better represent the expectations of the model. A probability generation strategy is also introduced to obtain the predicted probabilities of aspects, opinions, and aspect-opinion pairs. We have conducted extensive experiments on multiple benchmark datasets, where our model achieves state-of-the-art performance.
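To illustrate the kind of span scoring an MRC-style extractor performs, the sketch below scores each candidate span by the product of its start- and end-boundary probabilities and keeps spans above a threshold. This is a generic illustration of boundary-based span extraction, not the paper's specific matching rule or probability generation strategy; the function name, threshold, and length limit are all assumptions for the example.

```python
def extract_spans(start_probs, end_probs, threshold=0.25, max_len=3):
    """Return (start, end, score) for spans whose joint probability
    (start_probs[i] * end_probs[j]) exceeds the threshold.

    A hypothetical helper for demonstration; real MRC extractors
    typically add non-overlap constraints and model-specific scoring.
    """
    spans = []
    for i, p_start in enumerate(start_probs):
        # Only consider spans up to max_len tokens long.
        for j in range(i, min(i + max_len, len(end_probs))):
            score = p_start * end_probs[j]
            if score > threshold:
                spans.append((i, j, score))
    # Highest-scoring spans first.
    return sorted(spans, key=lambda s: -s[2])

# Example: boundary probabilities for a six-token sentence.
start_probs = [0.05, 0.90, 0.10, 0.02, 0.60, 0.05]
end_probs   = [0.02, 0.15, 0.85, 0.05, 0.10, 0.70]
print(extract_spans(start_probs, end_probs))
# → [(1, 2, 0.765), (4, 5, 0.42)]
```

Scoring spans jointly over both boundaries, rather than picking the start and end independently, is one common way to avoid the unilateral probability decrease the abstract mentions.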
Anthology ID:
2022.naacl-main.20
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
272–278
URL:
https://aclanthology.org/2022.naacl-main.20
DOI:
10.18653/v1/2022.naacl-main.20
Cite (ACL):
Shu Liu, Kaiwen Li, and Zuhe Li. 2022. A Robustly Optimized BMRC for Aspect Sentiment Triplet Extraction. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 272–278, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
A Robustly Optimized BMRC for Aspect Sentiment Triplet Extraction (Liu et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-main.20.pdf
Code
itkaven/robmrc
Data
ASTE-Data-V2