Yao Meng


2020

A Learning-Exploring Method to Generate Diverse Paraphrases with Multi-Objective Deep Reinforcement Learning
Mingtong Liu | Erguang Yang | Deyi Xiong | Yujie Zhang | Yao Meng | Changjian Hu | Jinan Xu | Yufeng Chen
Proceedings of the 28th International Conference on Computational Linguistics

Paraphrase generation (PG) is of great importance to many downstream tasks in natural language processing. Diversity is an essential property of PG for enhancing the generalization capability and robustness of downstream applications. Recently, neural sequence-to-sequence (Seq2Seq) models have shown promising results in PG. However, traditional model training for PG optimizes model predictions against a single reference with a cross-entropy loss, an objective that cannot encourage the model to generate diverse paraphrases. In this work, we present a novel multi-objective learning approach to PG. We propose a learning-exploring method that generates sentences as learning objectives from the learned data distribution, and employ reinforcement learning to combine these new learning objectives for model training. We first design a sample-based algorithm to explore diverse sentences. Then we introduce several reward functions that evaluate the sampled sentences as learning signals in terms of expressive diversity and semantic fidelity, aiming to generate diverse and high-quality paraphrases. To effectively optimize model performance across different evaluation aspects, we use a GradNorm-based algorithm that automatically balances these training objectives. Experiments and analyses on the Quora and Twitter datasets demonstrate that our proposed method not only gains a significant increase in diversity but also improves generation quality over several state-of-the-art baselines.
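The GradNorm-based balancing described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a simplified update in which each objective's weight is nudged toward the value that would equalize loss-ratio-scaled gradient norms, and all function and parameter names are hypothetical.

```python
def gradnorm_step(weights, grad_norms, loss_ratios, alpha=1.0, lr=0.25):
    """One simplified GradNorm-style rebalancing step (illustrative only).

    weights     -- current weight of each training objective
    grad_norms  -- gradient norm of each *unweighted* objective
    loss_ratios -- relative inverse training rate of each objective
                   (e.g. current loss / initial loss, normalized by the mean)
    """
    n = len(weights)
    # Average gradient norm of the weighted objectives.
    mean_norm = sum(w * g for w, g in zip(weights, grad_norms)) / n
    # Slow-training objectives (large loss ratio) get a larger target norm.
    targets = [mean_norm * (r ** alpha) for r in loss_ratios]
    # Move each weight toward the value that would hit its target norm.
    new = [w + lr * (t / g - w) for w, g, t in zip(weights, grad_norms, targets)]
    # Renormalize so the weights still sum to the number of objectives.
    scale = n / sum(new)
    return [w * scale for w in new]
```

In this sketch, an objective whose gradients dominate has its weight reduced, while an objective that is training slowly is up-weighted, which is the balancing behavior the abstract relies on.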

Balanced Joint Adversarial Training for Robust Intent Detection and Slot Filling
Xu Cao | Deyi Xiong | Chongyang Shi | Chao Wang | Yao Meng | Changjian Hu
Proceedings of the 28th International Conference on Computational Linguistics

Joint intent detection and slot filling has recently achieved tremendous success in advancing the performance of utterance understanding. However, many joint models still suffer from a robustness problem, especially on noisy inputs or rare/unseen events. To address this issue, we propose a Joint Adversarial Training (JAT) model to improve the robustness of joint intent detection and slot filling, which consists of two parts: (1) automatically generating joint adversarial examples to attack the joint model, and (2) training the model to defend against these joint adversarial examples so as to make it robust to small perturbations. As the generated joint adversarial examples have different impacts on the intent detection and slot filling losses, we further propose a Balanced Joint Adversarial Training (BJAT) model that applies a balance factor as a regularization term to the final loss function, yielding a stable training procedure. Extensive experiments and analyses on lightweight models show that our proposed methods achieve significantly higher scores and substantially improve the robustness of both intent detection and slot filling. In addition, the combination of our BJAT with BERT-large achieves state-of-the-art results on two datasets.

Bootstrapping Named Entity Recognition in E-Commerce with Positive Unlabeled Learning
Hanchu Zhang | Leonhard Hennig | Christoph Alt | Changjian Hu | Yao Meng | Chao Wang
Proceedings of The 3rd Workshop on e-Commerce and NLP

In this work, we introduce a bootstrapped, iterative NER model that integrates a PU learning algorithm for recognizing named entities in a low-resource setting. Our approach combines dictionary-based labeling with syntactically-informed label expansion to efficiently enrich the seed dictionaries. Experimental results on a dataset of manually annotated e-commerce product descriptions demonstrate the effectiveness of the proposed framework.

2016

Automatic Identifying Entity Type in Linked Data
Qingliang Miao | Ruiyu Fang | Shuangyong Song | Zhongguang Zheng | Lu Fang | Yao Meng | Jun Sun
Proceedings of the 30th Pacific Asia Conference on Language, Information and Computation: Posters

A Distribution-based Model to Learn Bilingual Word Embeddings
Hailong Cao | Tiejun Zhao | Shu Zhang | Yao Meng
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers

We introduce a distribution-based model to learn bilingual word embeddings from monolingual data. It is simple and effective, and requires neither parallel data nor a seed lexicon. We take advantage of the fact that word embeddings are usually dense, real-valued, low-dimensional vectors, so their distribution can be accurately estimated. We propose a novel cross-lingual learning objective that directly matches the distribution of word embeddings in one language with that in the other language. During the joint learning process, we dynamically estimate the distributions of word embeddings in the two languages and minimize the dissimilarity between them through the standard back-propagation algorithm. The learned bilingual word embeddings group each word and its translations together in the shared vector space. We demonstrate the utility of the learned embeddings on the task of finding word-to-word translations from monolingual corpora. Our model achieves encouraging performance both on related languages and on substantially different languages.
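The distribution-matching idea in the abstract can be sketched with a toy moment-matching objective: estimate the per-dimension mean and variance of the embeddings in each language and penalize their mismatch. This is an illustrative simplification under assumed names, not the paper's actual dissimilarity measure.

```python
def _mean_var(vectors):
    """Per-dimension mean and variance of a list of equal-length vectors."""
    n, d = len(vectors), len(vectors[0])
    means = [sum(v[j] for v in vectors) / n for j in range(d)]
    variances = [sum((v[j] - means[j]) ** 2 for v in vectors) / n
                 for j in range(d)]
    return means, variances

def moment_match_loss(src_embeddings, tgt_embeddings):
    """Squared distance between the first two moments of two embedding sets.

    A zero loss means the two sets have identical per-dimension means and
    variances; gradients of this quantity could drive the two embedding
    spaces toward a shared distribution.
    """
    ms, vs = _mean_var(src_embeddings)
    mt, vt = _mean_var(tgt_embeddings)
    return (sum((a - b) ** 2 for a, b in zip(ms, mt))
            + sum((a - b) ** 2 for a, b in zip(vs, vt)))
```

Minimizing such a quantity by gradient descent, as the abstract describes for its richer distribution estimate, pulls the two monolingual embedding distributions together without any parallel supervision.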

2013

Semi-supervised Classification of Twitter Messages for Organization Name Disambiguation
Shu Zhang | Jianwei Wu | Dequan Zheng | Yao Meng | Hao Yu
Proceedings of the Sixth International Joint Conference on Natural Language Processing

Cross-Lingual Link Discovery between Chinese and English Wiki Knowledge Bases
Qingliang Miao | Huayu Lu | Shu Zhang | Yao Meng
Proceedings of the 27th Pacific Asia Conference on Language, Information, and Computation (PACLIC 27)

2012

Improving Chinese-to-Japanese Patent Translation Using English as Pivot Language
Xianhua Li | Yao Meng | Hao Yu
Proceedings of the 26th Pacific Asia Conference on Language, Information, and Computation

An Adaptive Method for Organization Name Disambiguation with Feature Reinforcing
Shu Zhang | Jianwei Wu | Dequan Zheng | Yao Meng | Hao Yu
Proceedings of the 26th Pacific Asia Conference on Language, Information, and Computation

2011

Maximum Entropy Based Lexical Reordering Model for Hierarchical Phrase-based Machine Translation
Zhongguang Zheng | Yao Meng | Hao Yu
Proceedings of the 25th Pacific Asia Conference on Language, Information and Computation

Lexical-based Reordering Model for Hierarchical Phrase-based Machine Translation
Zhongguang Zheng | Yao Meng | Hao Yu
Proceedings of Machine Translation Summit XIII: Papers

Feedback Selecting of Manually Acquired Rules Using Automatic Evaluation
Xianhua Li | Yajuan Lü | Yao Meng | Qun Liu | Hao Yu
Proceedings of the 4th Workshop on Patent Translation

2010

Fault-Tolerant Learning for Term Extraction
Yuhang Yang | Hao Yu | Yao Meng | Yingliang Lu | Yingju Xia
Proceedings of the 24th Pacific Asia Conference on Language, Information and Computation

Extending the Hierarchical Phrase Based Model with Maximum Entropy Based BTG
Zhongjun He | Yao Meng | Hao Yu
Proceedings of the 9th Conference of the Association for Machine Translation in the Americas: Research Papers

In the hierarchical phrase-based (HPB) translation model, in addition to the hierarchical phrase pairs extracted from bi-text, glue rules are used to combine phrases serially. However, this basic method of combining phrases is insufficient for phrase reordering. In this paper, we extend the HPB model with a maximum entropy based bracketing transduction grammar (BTG), which provides content-dependent combination of neighboring phrases in two ways: serial or inverse. Experimental results show that the extended HPB system achieves absolute improvements of 0.9∼1.8 BLEU points over the baseline on large-scale translation tasks.

Extracting Product Features and Sentiments from Chinese Customer Reviews
Shu Zhang | Wenjie Jia | Yingju Xia | Yao Meng | Hao Yu
Proceedings of the Seventh International Conference on Language Resources and Evaluation (LREC'10)

With the growing interest in opinion mining from web data, more work has focused on mining English and Chinese reviews. Addressing the problem of product opinion mining, this paper describes our language resources in detail and applies them to the task of extracting product features and sentiments. Unlike traditional unsupervised methods, a supervised method is used to identify product features, combining domain knowledge and lexical information. Nearest-vicinity matching and syntactic-tree-based methods are proposed to identify the opinions regarding the product features. A multi-level analysis module is proposed to determine the sentiment orientation of the opinions. In experiments on the electronics reviews of COAE 2008, the validity of the product features identified by CRFs and of the two opinion-word identification methods is verified and compared. The results show that the resources are well utilized in this task and that our proposed method is effective.

Maximum Entropy Based Phrase Reordering for Hierarchical Phrase-Based Translation
Zhongjun He | Yao Meng | Hao Yu
Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing

Learning Phrase Boundaries for Hierarchical Phrase-based Translation
Zhongjun He | Yao Meng | Hao Yu
Coling 2010: Posters

2009

Reducing SMT Rule Table with Monolingual Key Phrase
Zhongjun He | Yao Meng | Yajuan Lü | Hao Yu | Qun Liu
Proceedings of the ACL-IJCNLP 2009 Conference Short Papers

A Bootstrapping Method for Finer-Grained Opinion Mining Using Graph Model
Shu Zhang | Yingju Xia | Yao Meng | Hao Yu
Proceedings of the 23rd Pacific Asia Conference on Language, Information and Computation, Volume 2

2007

A Conversational In-Car Dialog System
Baoshi Yan | Fuliang Weng | Zhe Feng | Florin Ratiu | Madhuri Raya | Yao Meng | Sebastian Varges | Matthew Purver | Annie Lien | Tobias Scheideck | Badri Raghunathan | Feng Lin | Rohit Mishra | Brian Lathrop | Zhaoxia Zhang | Harry Bratt | Stanley Peters
Proceedings of Human Language Technologies: The Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT)

CHAT to Your Destination
Fuliang Weng | Baoshi Yan | Zhe Feng | Florin Ratiu | Madhuri Raya | Brian Lathrop | Annie Lien | Sebastian Varges | Rohit Mishra | Feng Lin | Matthew Purver | Harry Bratt | Yao Meng | Stanley Peters | Tobias Scheideck | Badri Raghunathan | Zhaoxia Zhang
Proceedings of the 8th SIGdial Workshop on Discourse and Dialogue

2005

A Lexicon-Constrained Character Model for Chinese Morphological Analysis
Yao Meng | Hao Yu | Fumihito Nishino
Second International Joint Conference on Natural Language Processing: Full Papers