Yuxuan Lai, Yijia Liu, Yansong Feng, Songfang Huang, and Dongyan Zhao. 2021. Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1716-1731, Online. Association for Computational Linguistics. Editors: Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, and Yichao Zhou. Anthology ID: lai-etal-2021-lattice. DOI: 10.18653/v1/2021.naacl-main.137. https://aclanthology.org/2021.naacl-main.137/