Train Once for All: A Transitional Approach for Efficient Aspect Sentiment Triplet Extraction

Xinmeng Hou, Lingyue Fu, Chenhao Meng, Kounianhua Du, Hai Hu


Abstract
Aspect-Opinion Pair Extraction (AOPE) and Aspect Sentiment Triplet Extraction (ASTE) have drawn growing attention in NLP. However, most existing approaches extract aspects and opinions independently, optionally adding pairwise relations, which often leads to error propagation and high time complexity. To address these challenges, and inspired by transition-based dependency parsing, we propose the first transition-based model for AOPE and ASTE that performs aspect and opinion extraction jointly; it also better captures position-aware aspect-opinion relations and mitigates entity-level bias. By integrating contrastive-augmented optimization, our model delivers more accurate action predictions and jointly optimizes separate subtasks in linear time. Extensive experiments on four commonly used ASTE/AOPE datasets show that our transition-based model outperforms previous models on two of the four datasets when trained on a single dataset. When multiple training sets are used, our method achieves new state-of-the-art results on all datasets. We show that this is partly due to our model's ability to benefit from transition actions learned from multiple datasets and domains. Our code is available at https://github.com/Paparare/trans_aste.
Anthology ID:
2025.findings-emnlp.355
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6706–6719
URL:
https://aclanthology.org/2025.findings-emnlp.355/
Cite (ACL):
Xinmeng Hou, Lingyue Fu, Chenhao Meng, Kounianhua Du, and Hai Hu. 2025. Train Once for All: A Transitional Approach for Efficient Aspect Sentiment Triplet Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 6706–6719, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Train Once for All: A Transitional Approach for Efficient Aspect Sentiment Triplet Extraction (Hou et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.355.pdf
Checklist:
2025.findings-emnlp.355.checklist.pdf