SLPL-Sentiment at SemEval-2022 Task 10: Making Use of Pre-Trained Model’s Attention Values in Structured Sentiment Analysis

Sadrodin Barikbin


Abstract
Sentiment analysis is a useful problem that can serve a variety of fields, from business intelligence to social studies and even health studies. Using the SemEval 2022 Task 10 formulation of this problem and taking sequence labeling as our approach, we propose a model that learns the task by fine-tuning a pretrained transformer, introducing as few new parameters as possible (~150k), and reusing the attention values already computed inside the transformer. Our model improves on the shared task baselines on all task datasets.
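The abstract does not spell out the architecture; the general idea — reusing a pretrained transformer's attention maps as cheap, parameter-free features for a lightweight sequence-labeling head — might be sketched as follows. All names, shapes, and the particular attention summary below are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes (assumptions, not the paper's configuration).
seq_len, hidden, n_heads, n_labels = 8, 16, 4, 5

# Stand-ins for a pretrained transformer's outputs: token representations
# and per-head attention maps (with HuggingFace models these would come
# from a forward pass with output_attentions=True).
hidden_states = rng.normal(size=(seq_len, hidden))
attention = rng.random(size=(n_heads, seq_len, seq_len))
attention /= attention.sum(axis=-1, keepdims=True)  # each row sums to 1

# Attention-derived feature: how much attention each token *receives*,
# averaged over heads -- a summary that costs no extra parameters.
received = attention.mean(axis=0).sum(axis=0, keepdims=True).T  # (seq_len, 1)

# Lightweight labeling head: only this linear layer introduces new
# parameters, keeping the added parameter count small (the paper cites ~150k).
features = np.concatenate([hidden_states, received], axis=-1)
W = rng.normal(size=(hidden + 1, n_labels)) * 0.02
b = np.zeros(n_labels)
logits = features @ W + b
labels = logits.argmax(axis=-1)  # one sequence-labeling tag id per token
```

In a real system the random stand-ins would be replaced by the frozen or fine-tuned transformer's actual hidden states and attention tensors, and the head would be trained with a token-level classification loss.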
Anthology ID:
2022.semeval-1.192
Volume:
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Guy Emerson, Natalie Schluter, Gabriel Stanovsky, Ritesh Kumar, Alexis Palmer, Nathan Schneider, Siddharth Singh, Shyam Ratan
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1382–1388
URL:
https://aclanthology.org/2022.semeval-1.192
DOI:
10.18653/v1/2022.semeval-1.192
Cite (ACL):
Sadrodin Barikbin. 2022. SLPL-Sentiment at SemEval-2022 Task 10: Making Use of Pre-Trained Model’s Attention Values in Structured Sentiment Analysis. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 1382–1388, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
SLPL-Sentiment at SemEval-2022 Task 10: Making Use of Pre-Trained Model’s Attention Values in Structured Sentiment Analysis (Barikbin, SemEval 2022)
PDF:
https://aclanthology.org/2022.semeval-1.192.pdf
Video:
https://aclanthology.org/2022.semeval-1.192.mp4
Data
MPQA Opinion Corpus
MultiBooked