2019
Learning Relational Representations by Analogy using Hierarchical Siamese Networks
Gaetano Rossiello | Alfio Gliozzo | Robert Farrell | Nicolas Fauceglia | Michael Glass
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
We address relation extraction as an analogy problem by proposing a novel approach to learning representations of relations expressed by their textual mentions. Our assumption is that if two pairs of entities belong to the same relation, then those two pairs are analogous. Following this idea, we collect a large set of analogous pairs by matching knowledge-base triples against web-scale corpora through distant supervision. We leverage this dataset to train a hierarchical siamese network that learns entity-entity embeddings encoding relational information across the different linguistic paraphrases expressing the same relation. We evaluate our model on a one-shot learning task and show promising generalization to unseen relation types, which makes the approach suitable for automatic knowledge base population with minimal supervision. Moreover, the model can generate pre-trained embeddings that provide a valuable signal when integrated into an existing neural model, outperforming state-of-the-art methods on a downstream relation extraction task.
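The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the general idea of a siamese model trained to decide whether two entity-pair mentions express the same relation; the encoder, the contrastive loss, and all names and hyperparameters (PairEncoder, contrastive_loss, margin) are illustrative assumptions, not the authors' hierarchical model or training pipeline.

```python
# Minimal sketch of a siamese relation model, assuming PyTorch.
# The hierarchical encoder and distant-supervision data collection
# described in the paper are NOT reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairEncoder(nn.Module):
    """Encodes the textual mention of an entity pair into a relational embedding."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices of the mention context
        embedded = self.embed(token_ids)
        _, (h, _) = self.lstm(embedded)
        return self.proj(h[-1])  # (batch, hidden_dim)

class SiameseRelationModel(nn.Module):
    """Scores whether two entity-pair mentions are analogous (same relation)."""
    def __init__(self, vocab_size):
        super().__init__()
        # A single shared encoder applied to both inputs: the siamese part.
        self.encoder = PairEncoder(vocab_size)

    def forward(self, mention_a, mention_b):
        za = self.encoder(mention_a)
        zb = self.encoder(mention_b)
        return F.cosine_similarity(za, zb)  # high for analogous pairs

def contrastive_loss(sim, label, margin=0.5):
    # label = 1 if the two mentions share a relation (analogous), 0 otherwise
    pos = label * (1.0 - sim)
    neg = (1.0 - label) * torch.clamp(sim - margin, min=0.0)
    return (pos + neg).mean()
```

Once trained on analogous pairs gathered via distant supervision, the encoder's outputs can serve as pre-trained entity-entity embeddings, in the spirit of the downstream use described in the abstract.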
2016
Joint Learning of Local and Global Features for Entity Linking via Neural Networks
Thien Huu Nguyen | Nicolas Fauceglia | Mariano Rodriguez Muro | Oktie Hassanzadeh | Alfio Massimiliano Gliozzo | Mohammad Sadoghi
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Previous studies have highlighted the need for entity linking systems to capture both local entity-mention similarity and global topical coherence. We introduce a novel framework based on convolutional neural networks and recurrent neural networks to model the local and global features for entity linking simultaneously. The proposed model benefits from the capacity of convolutional neural networks to induce the underlying representations of local contexts and the ability of recurrent neural networks to adaptively compress variable-length sequences of predictions into global constraints. Our evaluation on multiple datasets demonstrates the effectiveness of the model and yields state-of-the-art performance. In addition, we examine the entity linking systems in a domain adaptation setting, which further demonstrates the cross-domain robustness of the proposed model.
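To make the local/global split concrete, here is a minimal PyTorch sketch under stated assumptions: a CNN induces a representation of the mention's local context, and an RNN carries document-level state across mentions. In the paper the RNN compresses sequences of previous linking predictions; here it runs over mention representations as a simplified stand-in, and all names (LocalCNN, GlobalLinker) and dimensions are hypothetical.

```python
# Minimal sketch of CNN (local context) + RNN (global state) entity linking, assuming PyTorch.
# Illustrative only; not the authors' exact architecture.
import torch
import torch.nn as nn

class LocalCNN(nn.Module):
    """Induces a representation of the local context around an entity mention."""
    def __init__(self, vocab_size, embed_dim=100, num_filters=128, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size, padding=1)

    def forward(self, context_ids):
        # context_ids: (num_mentions, seq_len) tokens surrounding each mention
        x = self.embed(context_ids).transpose(1, 2)   # (mentions, embed_dim, seq_len)
        x = torch.relu(self.conv(x))
        return x.max(dim=2).values                     # max-pool over positions

class GlobalLinker(nn.Module):
    """Scores candidate entities per mention, using an RNN over the mention
    sequence as a proxy for document-level (global) coherence."""
    def __init__(self, vocab_size, entity_count, hidden_dim=128):
        super().__init__()
        self.local = LocalCNN(vocab_size, num_filters=hidden_dim)
        self.entity_embed = nn.Embedding(entity_count, hidden_dim)
        self.rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.score = nn.Bilinear(hidden_dim, hidden_dim, 1)

    def forward(self, contexts, candidates):
        # contexts: (num_mentions, seq_len); candidates: (num_mentions, num_cands)
        local_reprs = self.local(contexts)                     # (mentions, hidden)
        # RNN over the document's mention sequence carries global state
        global_states, _ = self.rnn(local_reprs.unsqueeze(0))  # (1, mentions, hidden)
        mention_reprs = local_reprs + global_states.squeeze(0)
        cand_embeds = self.entity_embed(candidates)            # (mentions, cands, hidden)
        m = mention_reprs.unsqueeze(1).expand_as(cand_embeds)
        # bilinear score between each mention representation and each candidate
        return self.score(m.contiguous(), cand_embeds.contiguous()).squeeze(-1)
```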
2015
Word Sense Disambiguation via PropStore and OntoNotes for Event Mention Detection
Nicolas R. Fauceglia | Yiu-Chang Lin | Xuezhe Ma | Eduard Hovy
Proceedings of the 3rd Workshop on EVENTS: Definition, Detection, Coreference, and Representation