Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter

Wei Liu, Xiyan Fu, Yue Zhang, Wenming Xiao


Abstract
Lexicon information and pre-trained models, such as BERT, have been combined for Chinese sequence labeling tasks due to their respective strengths. However, existing methods fuse lexicon features only via a shallow, randomly initialized sequence layer and do not integrate them into the bottom layers of BERT. In this paper, we propose Lexicon Enhanced BERT (LEBERT) for Chinese sequence labeling, which integrates external lexicon knowledge directly into BERT layers through a Lexicon Adapter layer. Compared with existing methods, our model facilitates deep lexicon knowledge fusion at the lower layers of BERT. Experiments on ten Chinese datasets covering three tasks, namely Named Entity Recognition, Word Segmentation, and Part-of-Speech Tagging, show that LEBERT achieves state-of-the-art results.
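
The abstract describes a Lexicon Adapter that injects matched-word (lexicon) embeddings into character-level hidden states between BERT layers. The sketch below is a minimal, hypothetical PyTorch rendering of that idea, assuming bilinear-attention pooling over the words matched to each character followed by additive fusion and layer normalization; the module name, projection, and placement are illustrative assumptions rather than the paper's exact implementation.

import torch
import torch.nn as nn

class LexiconAdapter(nn.Module):
    """Hypothetical adapter that fuses matched-word embeddings into character hidden states."""

    def __init__(self, hidden_size: int, word_embed_size: int):
        super().__init__()
        # Project lexicon word embeddings into the BERT hidden space.
        self.word_proj = nn.Sequential(
            nn.Linear(word_embed_size, hidden_size),
            nn.Tanh(),
        )
        # Bilinear attention weight for scoring each matched word against its character.
        self.attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.layer_norm = nn.LayerNorm(hidden_size)

    def forward(self, char_hidden, word_embeds, word_mask):
        # char_hidden: (batch, seq_len, hidden_size) hidden states from a BERT layer
        # word_embeds: (batch, seq_len, max_words, word_embed_size) lexicon matches per character
        # word_mask:   (batch, seq_len, max_words), 1 for real matches, 0 for padding
        words = self.word_proj(word_embeds)                                   # (B, L, W, H)
        scores = torch.einsum('blh,blwh->blw', self.attn(char_hidden), words)
        scores = scores.masked_fill(word_mask == 0, float('-inf'))
        weights = torch.softmax(scores, dim=-1)
        # Characters with no matched words get an all-zero lexicon vector.
        weights = torch.nan_to_num(weights) * word_mask
        lexicon_vec = torch.einsum('blw,blwh->blh', weights, words)          # (B, L, H)
        # Inject lexicon knowledge and renormalize before the next Transformer layer.
        return self.layer_norm(char_hidden + lexicon_vec)

# Example usage with hypothetical sizes:
# adapter = LexiconAdapter(hidden_size=768, word_embed_size=200)
# fused = adapter(char_hidden, word_embeds, word_mask)

In such a design, the adapter would be applied to the output of one of the lower Transformer layers before it feeds into the next layer, which is consistent with the abstract's claim of deep lexicon fusion at the lower layers of BERT.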
Anthology ID:
2021.acl-long.454
Original:
2021.acl-long.454v1
Version 2:
2021.acl-long.454v2
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
5847–5858
URL:
https://aclanthology.org/2021.acl-long.454
DOI:
10.18653/v1/2021.acl-long.454
Cite (ACL):
Wei Liu, Xiyan Fu, Yue Zhang, and Wenming Xiao. 2021. Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 5847–5858, Online. Association for Computational Linguistics.
Cite (Informal):
Lexicon Enhanced Chinese Sequence Labeling Using BERT Adapter (Liu et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.454.pdf
Video:
https://aclanthology.org/2021.acl-long.454.mp4
Code:
liuwei1206/LEBERT
Data:
Resume NER, Universal Dependencies, Weibo NER