2019
A Prism Module for Semantic Disentanglement in Name Entity Recognition
Kun Liu | Shen Li | Daqi Zheng | Zhengdong Lu | Sheng Gao | Si Li
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Natural language processing has long been troubled by the problem that multiple semantics are mixed inside a word, even with the help of context. To address this problem, we propose a prism module to disentangle the semantic aspects of words and reduce noise at the input layer of a model. In the prism module, some words are selectively replaced with task-related semantic aspects, and these denoised word representations can then be fed into downstream models to make their tasks easier. In addition, we introduce a structure to train this module jointly with the downstream model without additional data. The module can be easily integrated into the downstream model and significantly improves the performance of baselines on the named entity recognition (NER) task. An ablation analysis demonstrates the soundness of the method. As a side effect, the proposed method also provides a way to visualize the contribution of each word.
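The abstract describes the prism module only at a high level. Below is a minimal, hypothetical PyTorch sketch of the selective-replacement idea it mentions; the PrismSketch class, the fixed inventory of aspect embeddings, and the per-token gate are illustrative assumptions, not the authors' implementation.

# A minimal sketch of selective replacement at the input layer, assuming a fixed
# inventory of task-related "semantic aspect" embeddings and a learned per-token
# gate that decides whether to keep the original word embedding or replace it.
import torch
import torch.nn as nn


class PrismSketch(nn.Module):
    def __init__(self, embed_dim: int, num_aspects: int):
        super().__init__()
        # Embeddings for the task-related semantic aspects (assumed, not from the paper).
        self.aspect_embeddings = nn.Embedding(num_aspects, embed_dim)
        # Scores how strongly each token maps to each aspect.
        self.aspect_scorer = nn.Linear(embed_dim, num_aspects)
        # Gate deciding replace-vs-keep for each token.
        self.replace_gate = nn.Linear(embed_dim, 1)

    def forward(self, word_embeds: torch.Tensor) -> torch.Tensor:
        # word_embeds: (batch, seq_len, embed_dim)
        aspect_probs = torch.softmax(self.aspect_scorer(word_embeds), dim=-1)
        # Soft mixture over the aspect embeddings for each token.
        aspect_repr = aspect_probs @ self.aspect_embeddings.weight
        # Gate near 1 -> replace the word with its aspect representation.
        gate = torch.sigmoid(self.replace_gate(word_embeds))
        return gate * aspect_repr + (1.0 - gate) * word_embeds


if __name__ == "__main__":
    module = PrismSketch(embed_dim=64, num_aspects=8)
    dummy = torch.randn(2, 10, 64)
    print(module(dummy).shape)  # torch.Size([2, 10, 64])

The denoised output has the same shape as the input embeddings, so in this reading it could be dropped in front of any downstream NER encoder and trained jointly with it.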
2018
Object-oriented Neural Programming (OONP) for Document Understanding
Zhengdong Lu | Xianggen Liu | Haotian Cui | Yukun Yan | Daqi Zheng
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
We propose Object-oriented Neural Programming (OONP), a framework for semantically parsing documents in specific domains. OONP reads a document and parses it into a predesigned object-oriented data structure that reflects the domain-specific semantics of the document. An OONP parser models semantic parsing as a decision process: a neural net-based Reader sequentially goes through the document, building and updating an intermediate ontology along the way to summarize its partial understanding of the text. OONP supports a wide variety of forms (both symbolic and differentiable) for representing the state and the document, and a rich family of operations to compose those representations. An OONP parser can be trained with supervision of different forms and strengths, including supervised learning (SL), reinforcement learning (RL), and a hybrid of the two. Our experiments on both synthetic and real-world document parsing tasks show that OONP can learn to handle fairly complicated ontologies with modest amounts of training data.
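As a rough illustration of the decision-process view of parsing described in the abstract, here is a minimal, non-neural Python sketch in which a Reader walks the document token by token and issues actions that build an object-oriented ontology. The OntologyObject and Ontology classes, the action names, and the rule_based_policy stub are illustrative assumptions standing in for the paper's neural Reader.

# Minimal sketch: a reader loop that builds and updates an intermediate ontology.
# The policy here is a hand-written stub; in OONP it would be a neural net
# trained with supervised learning, reinforcement learning, or a hybrid.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class OntologyObject:
    obj_type: str
    properties: Dict[str, str] = field(default_factory=dict)


@dataclass
class Ontology:
    objects: List[OntologyObject] = field(default_factory=list)

    def new_object(self, obj_type: str) -> OntologyObject:
        obj = OntologyObject(obj_type)
        self.objects.append(obj)
        return obj


def rule_based_policy(token: str, ontology: Ontology) -> str:
    """Stand-in for the neural Reader's policy: returns an action name."""
    if token.istitle():
        return "new_person" if not ontology.objects else "set_alias"
    return "skip"


def parse_document(tokens: List[str]) -> Ontology:
    ontology = Ontology()
    for token in tokens:
        action = rule_based_policy(token, ontology)
        if action == "new_person":
            ontology.new_object("Person").properties["name"] = token
        elif action == "set_alias":
            ontology.objects[-1].properties.setdefault("alias", token)
        # "skip" leaves the ontology unchanged.
    return ontology


if __name__ == "__main__":
    print(parse_document("the suspect Zhang met Li yesterday".split()))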
2011
Maximum Rank Correlation Training for Statistical Machine Translation
Daqi Zheng | Yifan He | Yang Liu | Qun Liu
Proceedings of Machine Translation Summit XIII: Papers
2009
The ICT statistical machine translation system for the IWSLT 2009
Haitao Mi | Yang Li | Tian Xia | Xinyan Xiao | Yang Feng | Jun Xie | Hao Xiong | Zhaopeng Tu | Daqi Zheng | Yanjuan Lu | Qun Liu
Proceedings of the 6th International Workshop on Spoken Language Translation: Evaluation Campaign
This paper describes the ICT statistical machine translation systems used in the evaluation campaign of the International Workshop on Spoken Language Translation (IWSLT) 2009. For this year's evaluation, we participated in the Challenge Task (Chinese-English and English-Chinese) and the BTEC Task (Chinese-English). We mainly focused on one new method to improve single-system translation quality: a sentence-similarity based development set selection technique. For each task, we submitted the single system that achieved the highest BLEU score on the selected development set. The four single translation systems are based on different techniques: a linguistically syntax-based system, two formally syntax-based systems, and a phrase-based system. Notably, we did not use any rescoring or system combination techniques in this year's evaluation.
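The abstract does not specify the similarity measure behind the development set selection. The sketch below illustrates one plausible reading: score each candidate development sentence against the test set with a simple bag-of-words cosine and keep the top-k. The select_dev_set function and the similarity measure are assumptions, not the authors' method.

# Minimal sketch of similarity-based development set selection, assuming a
# bag-of-words cosine between each candidate sentence and the test set.
from collections import Counter
import math
from typing import List


def cosine(a: Counter, b: Counter) -> float:
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def select_dev_set(candidates: List[str], test_sentences: List[str], k: int) -> List[str]:
    """Keep the k candidate sentences most similar to the test set."""
    test_bow = Counter(w for s in test_sentences for w in s.split())
    ranked = sorted(candidates, key=lambda s: cosine(Counter(s.split()), test_bow), reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    dev_pool = ["where is the train station", "i would like a coffee", "the stock market fell"]
    test = ["how do i get to the station", "is the train on time"]
    print(select_dev_set(dev_pool, test, k=2))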