Jun He
2022
An MRC Framework for Semantic Role Labeling
Nan Wang | Jiwei Li | Yuxian Meng | Xiaofei Sun | Han Qiu | Ziyao Wang | Guoyin Wang | Jun He
Proceedings of the 29th International Conference on Computational Linguistics
Semantic Role Labeling (SRL) aims at recognizing the predicate-argument structure of a sentence and can be decomposed into two subtasks: predicate disambiguation and argument labeling. Prior work deals with these two tasks independently, ignoring the semantic connection between them. In this paper, we propose to use the machine reading comprehension (MRC) framework to bridge this gap. We formalize predicate disambiguation as multiple-choice machine reading comprehension, where the descriptions of candidate senses of a given predicate are used as options to select the correct sense. The chosen predicate sense is then used to determine the semantic roles for that predicate, and these semantic roles are used to construct the query for another MRC model for argument labeling. In this way, we are able to leverage both the predicate semantics and the semantic role semantics for argument labeling. We also propose to select a subset of all the possible semantic roles for computational efficiency. Experiments show that the proposed framework achieves results that are state-of-the-art or comparable to previous work.
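A minimal sketch of the two-stage MRC formulation the abstract describes, in Python. The function names, query templates, and toy sense/role inventories below are illustrative assumptions, not the authors' released code; they only show how predicate disambiguation can be cast as multiple-choice MRC and argument labeling as per-role span-extraction queries.

```python
# Stage 1: predicate disambiguation as multiple-choice MRC.
def build_sense_mc_input(sentence, predicate, sense_descriptions):
    """The sentence is the passage, the predicate anchors the question,
    and each candidate sense description becomes one answer option."""
    question = f"What is the sense of the predicate '{predicate}' in this sentence?"
    return {
        "passage": sentence,
        "question": question,
        "options": sense_descriptions,  # one description string per candidate sense
    }

# Stage 2: argument labeling as span-extraction MRC.
def build_argument_queries(sentence, predicate, chosen_sense, role_descriptions):
    """For each semantic role licensed by the chosen sense, build one
    natural-language query over the same sentence."""
    queries = []
    for role, description in role_descriptions.items():
        queries.append({
            "passage": sentence,
            "query": (f"For the predicate '{predicate}' ({chosen_sense}), "
                      f"which span fills the role {role}: {description}?"),
            "role": role,
        })
    return queries

# Toy usage with a hypothetical sense inventory and role set.
sentence = "The company acquired a startup last year."
mc_input = build_sense_mc_input(
    sentence, "acquired",
    ["acquire.01: get, gain possession of", "acquire.02: learn, pick up a skill"],
)
arg_queries = build_argument_queries(
    sentence, "acquired", "acquire.01",
    {"ARG0": "the acquirer", "ARG1": "the thing acquired"},
)
```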
2020
Improving Entity Linking through Semantic Reinforced Entity Embeddings
Feng Hou | Ruili Wang | Jun He | Yi Zhou
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Entity embeddings, which represent different aspects of each entity with a single vector like word embeddings, are a key component of neural entity linking models. Existing entity embeddings are learned from canonical Wikipedia articles and local contexts surrounding target entities. Such entity embeddings are effective, but too distinctive for linking models to learn contextual commonality. We propose a simple yet effective method, FGS2EE, to inject fine-grained semantic information into entity embeddings to reduce the distinctiveness and facilitate the learning of contextual commonality. FGS2EE first uses the embeddings of semantic type words to generate semantic embeddings, and then combines them with existing entity embeddings through linear aggregation. Extensive experiments show the effectiveness of such embeddings. Based on our entity embeddings, we achieve new state-of-the-art performance on entity linking.
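A minimal sketch of the linear aggregation step described above, assuming the semantic embedding is the mean of the fine-grained type-word vectors and `alpha` is a fixed mixing weight; both choices, and all names below, are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def semantic_embedding(type_words, word_vectors):
    """Average the embeddings of an entity's fine-grained semantic type words."""
    vecs = np.stack([word_vectors[w] for w in type_words])
    return vecs.mean(axis=0)

def inject_semantics(entity_emb, type_words, word_vectors, alpha=0.8):
    """Combine the original entity embedding with the semantic embedding by
    linear aggregation, reducing distinctiveness while keeping entity identity."""
    sem = semantic_embedding(type_words, word_vectors)
    return alpha * entity_emb + (1.0 - alpha) * sem

# Toy usage: random 300-d vectors stand in for pretrained word/entity embeddings.
rng = np.random.default_rng(0)
word_vectors = {w: rng.normal(size=300) for w in ["politician", "lawyer"]}
entity_emb = rng.normal(size=300)
enriched = inject_semantics(entity_emb, ["politician", "lawyer"], word_vectors)
```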
1997
Learning New Compositions from Given Ones
Donghong Ji | Jun He | Changning Huang
CoNLL97: Computational Natural Language Learning