Yaohong Jin


2020

BERT-EMD: Many-to-Many Layer Mapping for BERT Compression with Earth Mover’s Distance
Jianquan Li | Xiaokang Liu | Honghong Zhao | Ruifeng Xu | Min Yang | Yaohong Jin
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Pre-trained language models (e.g., BERT) have achieved significant success in various natural language processing (NLP) tasks. However, high storage and computational costs prevent pre-trained language models from being effectively deployed on resource-constrained devices. In this paper, we propose a novel BERT distillation method based on many-to-many layer mapping, which allows each intermediate student layer to learn from any intermediate teacher layer. In this way, our model can adaptively learn from different teacher layers for different NLP tasks. In addition, we leverage Earth Mover's Distance (EMD) to compute the minimum cumulative cost that must be paid to transfer knowledge from the teacher network to the student network; EMD enables effective matching under the many-to-many layer mapping. Furthermore, we propose a cost attention mechanism that automatically learns the layer weights used in EMD, which is expected to further improve the model's performance and accelerate convergence. Extensive experiments on the GLUE benchmark demonstrate that our model achieves competitive performance compared to strong competitors in terms of both accuracy and model compression.
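The core idea in the abstract can be illustrated with a small sketch: treat each teacher and student layer as a point with an associated weight, build a pairwise cost matrix between layers, and solve the resulting transport problem to obtain a many-to-many layer mapping. This is only an illustrative sketch, not the authors' implementation: the function name `layer_emd`, the mean-squared-error layer cost, and the uniform layer weights are all assumptions; the paper additionally learns the weights via a cost attention mechanism, which is omitted here.

```python
import numpy as np
from scipy.optimize import linprog

def layer_emd(teacher_layers, student_layers, t_weights, s_weights):
    """Earth Mover's Distance between teacher and student layer sets.

    cost[i][j] is the mean-squared error between teacher layer i and
    student layer j; the optimal transport plan f[i][j] gives the
    many-to-many layer mapping, and the objective value is the EMD.
    """
    T, S = len(teacher_layers), len(student_layers)
    cost = np.array([[np.mean((t - s) ** 2) for s in student_layers]
                     for t in teacher_layers])            # shape (T, S)
    c = cost.ravel()                                       # flatten f[i][j]
    A_eq = np.zeros((T + S, T * S))
    for i in range(T):                                     # row marginals:
        A_eq[i, i * S:(i + 1) * S] = 1.0                   #   sum_j f[i][j] = t_weights[i]
    for j in range(S):                                     # column marginals:
        A_eq[T + j, j::S] = 1.0                            #   sum_i f[i][j] = s_weights[j]
    b_eq = np.concatenate([t_weights, s_weights])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    return res.fun, res.x.reshape(T, S)
```

For example, with two identical teacher and student layers and uniform weights, the plan routes all mass along the zero-cost diagonal and the distance is zero; with mismatched layers, mass is split across several teacher layers, which is exactly the many-to-many behavior described above.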

2017

Translating Implicit Discourse Connectives Based on Cross-lingual Annotation and Alignment
Hongzheng Li | Philippe Langlais | Yaohong Jin
Proceedings of the Third Workshop on Discourse in Machine Translation

Implicit discourse connectives and relations are distributed more widely in Chinese texts; when translating into English, such connectives are usually rendered explicitly. With a view toward Chinese-English MT, in this paper we describe cross-lingual annotation and alignment of discourse connectives in a parallel corpus, reporting related surveys and findings. We then conduct evaluation experiments to test the translation of implicit connectives and whether representing implicit connectives explicitly in the source language can significantly improve the final translation performance. Preliminary results show little improvement from simply inserting explicit connectives for implicit relations.

2015

Identifying Prepositional Phrases in Chinese Patent Texts with Rule-based and CRF Methods
Hongzheng Li | Yaohong Jin
Proceedings of the 29th Pacific Asia Conference on Language, Information and Computation

A Hybrid System for Chinese-English Patent Machine Translation
Hongzheng Li | Kai Zhao | Renfen Hu | Yun Zhu | Yaohong Jin
Proceedings of the 6th Workshop on Patent and Scientific Literature Translation

A CRF Method of Identifying Prepositional Phrases in Chinese Patent Texts
Hongzheng Li | Yaohong Jin
Proceedings of the Eighth SIGHAN Workshop on Chinese Language Processing

2014

Pre-reordering Model of Chinese Special Sentences for Patent Machine Translation
Renfen Hu | Zhiying Liu | Lijiao Yang | Yaohong Jin
Proceedings of the COLING Workshop on Synchronic and Diachronic Approaches to Analyzing Technical Language

Local Phrase Reordering Model for Chinese-English Patent Machine Translation
Xiaodie Liu | Yun Zhu | Yaohong Jin
Proceedings of The Third CIPS-SIGHAN Joint Conference on Chinese Language Processing

2006

Semantic Analysis of Chinese Garden-Path Sentences
Yaohong Jin
Proceedings of the Fifth SIGHAN Workshop on Chinese Language Processing