Dingxian Wang
2023
AMR-TST: Abstract Meaning Representation-based Text Style Transfer
Kaize Shi | Xueyao Sun | Li He | Dingxian Wang | Qing Li | Guandong Xu
Findings of the Association for Computational Linguistics: ACL 2023
Abstract Meaning Representation (AMR) is a semantic representation that can enhance natural language generation (NLG) by providing a logical semantic input. In this paper, we propose AMR-TST, an AMR-based text style transfer (TST) technique. AMR-TST converts the source text to an AMR graph, modifies the graph with a TST policy named style rewriting, and generates the transferred text from the modified graph. Our method combines the explainability of explicit TST methods with the diversity of implicit ones. Experiments show that the proposed method achieves state-of-the-art results compared with baseline models in both automatic and human evaluations. In the qualitative evaluation, the generated transferred text shows that AMR-TST has significant advantages in preserving semantic features and reducing hallucinations. To the best of our knowledge, this work is the first to apply an AMR-based method focusing on node-level features to the TST task.
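As a rough illustration of the pipeline this abstract describes (parse the source text to an AMR graph, rewrite style-bearing nodes, then generate), here is a minimal, hypothetical Python sketch. The hand-written `amr` graph, the `POLARITY_SWAP` table, and the `naive_generate` realiser are toy stand-ins invented for illustration; they are not the parser, rewriting policy, or generation model used in the paper.

```python
# Hypothetical, heavily simplified AMR-like graph for "the food was bad":
# variables map to concepts, and edges carry AMR roles.
amr = {
    "nodes": {"b": "bad-07", "f": "food"},
    "edges": [("b", ":ARG1", "f")],
    "root": "b",
}

# Toy "style rewriting" policy: flip the polarity of sentiment-bearing
# concept nodes (a placeholder for the learned policy in the paper).
POLARITY_SWAP = {"bad-07": "good-02", "good-02": "bad-07"}

def style_rewrite(graph):
    """Return a copy of the graph with sentiment concepts swapped."""
    rewritten = {**graph, "nodes": dict(graph["nodes"])}
    for var, concept in rewritten["nodes"].items():
        rewritten["nodes"][var] = POLARITY_SWAP.get(concept, concept)
    return rewritten

def naive_generate(graph):
    """Trivial surface realisation standing in for an AMR-to-text model."""
    root = graph["root"]
    predicate = graph["nodes"][root].split("-")[0]            # e.g. "good"
    args = [graph["nodes"][t] for s, _, t in graph["edges"] if s == root]
    return f"the {' '.join(args)} was {predicate}"

print(naive_generate(style_rewrite(amr)))  # -> "the food was good"
```

Because the rewrite operates on graph nodes rather than surface tokens, the semantic roles of the untouched concepts (here, `food` as `:ARG1`) are carried through to generation unchanged, which is the intuition behind the paper's claim about preserving semantic features.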
2020
Graph-based Aspect Representation Learning for Entity Resolution
Zhenqi Zhao | Yuchen Guo | Dingxian Wang | Yufan Huang | Xiangnan He | Bin Gu
Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs)
Entity Resolution (ER) identifies records that refer to the same real-world entity. Deep learning approaches have improved the generalization ability of entity matching models but hardly overcome the impact of noisy or incomplete data sources. In real-world scenarios, an entity usually consists of multiple semantic facets, called aspects. In this paper, we focus on entity augmentation, namely retrieving the values of missing aspects. The relationships between aspects are naturally suited to representation as a knowledge graph, on which entity augmentation can be modeled as a link prediction problem. This paper proposes a novel graph-based approach to entity augmentation. Specifically, we apply a dedicated random walk algorithm that uses node types to limit the traversal length and encodes the graph structure into low-dimensional embeddings, so that missing aspects can be retrieved by a link prediction model. Furthermore, the augmented aspects, arranged in a fixed order, serve as the input to a deep Siamese BiLSTM network for entity matching. We compared our method with state-of-the-art methods through extensive experiments on downstream ER tasks. The experimental results show that our model outperforms the other methods by a large margin on all evaluation metrics (accuracy, precision, recall, and F1-score), demonstrating the effectiveness of our method.
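As a rough illustration of the type-constrained random walk the abstract mentions, here is a minimal, hypothetical Python sketch. The toy `GRAPH`, its entity and aspect names, and the `max_type_switches` stopping rule are assumptions made for illustration only; they are not the paper's actual graph, data, or algorithm.

```python
import random

# Hypothetical toy knowledge graph: node -> (node type, neighbours).
# Entities and their aspects (brand, colour, ...) in the ER setting
# described in the abstract; all names are made up.
GRAPH = {
    "entity:phone_a":     ("entity", ["aspect:brand/acme", "aspect:color/blue"]),
    "entity:phone_b":     ("entity", ["aspect:brand/acme", "aspect:color/black"]),
    "aspect:brand/acme":  ("aspect", ["entity:phone_a", "entity:phone_b"]),
    "aspect:color/blue":  ("aspect", ["entity:phone_a"]),
    "aspect:color/black": ("aspect", ["entity:phone_b"]),
}

def typed_random_walk(start, max_type_switches=2, max_steps=20, rng=random):
    """Random walk that stops once it has crossed node-type boundaries
    more than `max_type_switches` times -- a toy stand-in for using node
    types to limit traversal length, as the abstract describes."""
    walk, current, switches = [start], start, 0
    for _ in range(max_steps):
        ntype, neighbours = GRAPH[current]
        if not neighbours:
            break
        nxt = rng.choice(neighbours)
        if GRAPH[nxt][0] != ntype:
            switches += 1
            if switches > max_type_switches:
                break
        walk.append(nxt)
        current = nxt
    return walk

if __name__ == "__main__":
    random.seed(0)
    # Walks like this one would feed an embedding model, and a link
    # predictor over the embeddings would then suggest missing aspects.
    print(typed_random_walk("entity:phone_a"))
```

Limiting the walk by type switches rather than raw step count keeps the sampled context close to a single entity and its aspects, which is one plausible reading of why node types are used to bound traversal length.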