Constituency Lattice Encoding for Aspect Term Extraction

Yunyi Yang, Kun Li, Xiaojun Quan, Weizhou Shen, Qinliang Su


Abstract
One of the remaining challenges for aspect term extraction in sentiment analysis is the extraction of phrase-level aspect terms, as it is non-trivial to determine the boundaries of such terms. In this paper, we aim to address this issue by incorporating the span annotations of the constituents of a sentence to leverage syntactic information in neural network models. To this end, we first construct a constituency lattice structure based on the constituents of a constituency tree. Then, we present two approaches to encoding the constituency lattice, using BiLSTM-CRF and BERT as the base models, respectively. We evaluated the two models on two benchmark datasets, and the results confirm their superiority, with gains of 3.17 and 1.35 points in F1-measure, respectively, over the current state of the art. The improvements demonstrate the effectiveness of the constituency lattice for aspect term extraction.
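The lattice construction described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation (see the linked leekum2018/cle4ate repository for that): each constituent of a parse tree contributes a span edge to the lattice alongside the word-level edges, so that phrase boundaries such as "the fish burger" become candidate aspect-term spans. The tree representation and function name here are assumptions for illustration.

```python
# Illustrative sketch: derive constituent spans from a parse tree,
# which serve as the phrase-level edges of a constituency lattice.
# The nested-tuple tree format is an assumption for this example.

def constituent_spans(tree):
    """Return (start, end, label) spans for every non-leaf constituent.

    `tree` is (label, children), where children is either a word
    string (for a preterminal) or a list of subtrees.
    """
    spans = []

    def walk(node, start):
        label, children = node
        if isinstance(children, str):        # preterminal over one word
            return start + 1
        end = start
        for child in children:
            end = walk(child, end)
        spans.append((start, end, label))    # phrase-level lattice edge
        return end

    walk(tree, 0)
    return spans

# "The fish burger is tasty": the NP span (0, 3) covers a
# multi-word candidate aspect term.
tree = ("S", [
    ("NP", [("DT", "The"), ("NN", "fish"), ("NN", "burger")]),
    ("VP", [("VBZ", "is"), ("ADJP", [("JJ", "tasty")])]),
])
print(constituent_spans(tree))
# → [(0, 3, 'NP'), (4, 5, 'ADJP'), (3, 5, 'VP'), (0, 5, 'S')]
```

In the paper's setting these span edges are merged with the token sequence so the encoder (BiLSTM-CRF or BERT) can attend to whole constituents when deciding aspect-term boundaries.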
Anthology ID:
2020.coling-main.73
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
844–855
URL:
https://aclanthology.org/2020.coling-main.73
DOI:
10.18653/v1/2020.coling-main.73
Cite (ACL):
Yunyi Yang, Kun Li, Xiaojun Quan, Weizhou Shen, and Qinliang Su. 2020. Constituency Lattice Encoding for Aspect Term Extraction. In Proceedings of the 28th International Conference on Computational Linguistics, pages 844–855, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Constituency Lattice Encoding for Aspect Term Extraction (Yang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.73.pdf
Code
leekum2018/cle4ate
Data
Penn Treebank
SemEval-2014 Task-4