A Span-based Dynamic Local Attention Model for Sequential Sentence Classification

Xichen Shang, Qianli Ma, Zhenxi Lin, Jiangyue Yan, Zipeng Chen


Abstract
Sequential sentence classification aims to classify each sentence in a document based on the context in which the sentences appear. Most existing work addresses this problem with a hierarchical sequence labeling network. However, these approaches ignore the latent segment structure of the document, in which contiguous sentences often share coherent semantics. In this paper, we propose a span-based dynamic local attention model that explicitly captures this structural information through supervised dynamic local attention. We further introduce an auxiliary task, span-based classification, to exploit span-level representations. Extensive experiments show that our model achieves better or competitive performance against state-of-the-art baselines on two benchmark datasets.
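For intuition, the sketch below shows plain fixed-window local attention over sentence vectors, where each sentence attends only to its neighbors. This is a simplified stand-in, not the paper's method: the authors' dynamic local attention learns span boundaries under supervision rather than using a fixed window, and all names here are illustrative.

```python
import numpy as np

def local_attention(q, k, v, window=2):
    """Fixed-window local self-attention over n sentence vectors.

    Each position attends only to positions within `window` steps of
    itself. In the paper's dynamic variant the attention locality is
    learned (span-supervised), not fixed as it is here.
    """
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)                      # (n, n) logits
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores[mask] = -np.inf                             # block out-of-window pairs
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    return weights @ v                                 # (n, d) contextualized outputs
```

With `window=0` each sentence attends only to itself, so the output equals the input; larger windows mix in neighboring-sentence context.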
Anthology ID:
2021.acl-short.26
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
198–203
URL:
https://aclanthology.org/2021.acl-short.26
DOI:
10.18653/v1/2021.acl-short.26
Cite (ACL):
Xichen Shang, Qianli Ma, Zhenxi Lin, Jiangyue Yan, and Zipeng Chen. 2021. A Span-based Dynamic Local Attention Model for Sequential Sentence Classification. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 198–203, Online. Association for Computational Linguistics.
Cite (Informal):
A Span-based Dynamic Local Attention Model for Sequential Sentence Classification (Shang et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-short.26.pdf
Optional supplementary material:
2021.acl-short.26.OptionalSupplementaryMaterial.pdf
Video:
https://aclanthology.org/2021.acl-short.26.mp4