Olof Mogren


Character-based recurrent neural networks for morphological relational reasoning
Olof Mogren | Richard Johansson
Proceedings of the First Workshop on Subword and Character Level Models in NLP

We present a model for predicting word forms based on morphological relational reasoning with analogies. While previous work has explored tasks such as morphological inflection and reinflection, these models rely on an explicit enumeration of morphological features, which may not be available in all cases. To address the task of predicting a word form given a demo relation (a pair of word forms) and a query word, we devise a character-based recurrent neural network architecture using three separate encoders and a decoder. We also investigate a multi-task learning setup, where the prediction of the relation type label is used as an auxiliary task. Our results show that the exact form can be predicted for English with an accuracy of 94.7%. For Swedish, which has a more complex morphology with more inflectional patterns for nouns and verbs, the accuracy is 89.3%. We also show that using the auxiliary task of learning the relation type speeds up convergence and improves the prediction accuracy for the word generation task.
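The three-encoder setup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the weights are random and untrained, the plain Elman cell stands in for the trained recurrent units, and all names and sizes (`H`, `relation_state`, the example words) are assumptions for the sake of the sketch.

```python
import numpy as np

# Sketch of the three-encoder idea: the demo relation (source form, target
# form) and the query word are each encoded by a character-level RNN; the
# concatenated final states would seed a decoder that emits the predicted
# word form character by character. Weights are random and untrained here.

CHARS = "abcdefghijklmnopqrstuvwxyz"
C2I = {c: i for i, c in enumerate(CHARS)}
H = 16  # hidden size (illustrative assumption)

rng = np.random.default_rng(0)

def make_cell():
    """Random Elman RNN cell: h' = tanh(W x + U h + b)."""
    return (rng.normal(0, 0.1, (H, len(CHARS))),
            rng.normal(0, 0.1, (H, H)),
            np.zeros(H))

def encode(word, cell):
    """Run the character sequence through the cell; return the final state."""
    W, U, b = cell
    h = np.zeros(H)
    for ch in word:
        x = np.zeros(len(CHARS))
        x[C2I[ch]] = 1.0
        h = np.tanh(W @ x + U @ h + b)
    return h

# Three separate encoders, as described in the abstract.
enc_demo_src, enc_demo_tgt, enc_query = make_cell(), make_cell(), make_cell()

def relation_state(demo_src, demo_tgt, query):
    """Concatenate the three encodings into the decoder's initial state."""
    return np.concatenate([encode(demo_src, enc_demo_src),
                           encode(demo_tgt, enc_demo_tgt),
                           encode(query, enc_query)])

# Demo relation "see" -> "saw", query "draw"; a trained decoder would be
# expected to generate "drew" from this state.
state = relation_state("see", "saw", "draw")
print(state.shape)  # (48,)
```

A trained version would replace the Elman cells with GRU or LSTM units and add a character-level decoder conditioned on `state`; the auxiliary relation-type classifier would read the same state.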


Assisting Discussion Forum Users using Deep Recurrent Neural Networks
Jacob Hagstedt P Suorra | Olof Mogren
Proceedings of the 1st Workshop on Representation Learning for NLP

Named Entity Recognition in Swedish Health Records with Character-Based Deep Bidirectional LSTMs
Simon Almgren | Sean Pavlov | Olof Mogren
Proceedings of the Fifth Workshop on Building and Evaluating Resources for Biomedical Text Mining (BioTxtM2016)

We propose an approach for named entity recognition in medical data, using a character-based deep bidirectional recurrent neural network. Such models can learn features and patterns based on the character sequence, and are not limited to a fixed vocabulary. This makes them very well suited for the NER task in the medical domain. Our experimental evaluation shows promising results, with a 60% improvement in F1 score over the baseline, and our system generalizes well between different datasets.
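The character-based bidirectional tagging described above can be sketched as follows. This is an illustrative assumption, not the paper's code: weights are random and untrained, a plain recurrent cell stands in for the LSTMs, and the tag set and example input are made up for the sketch.

```python
import numpy as np

# Sketch of character-level bidirectional NER: each character gets a
# forward state, a backward state, and a tag score computed from their
# concatenation. A trained system would use BiLSTMs fitted to annotated
# health records; here everything is random and untrained.

CHARS = "abcdefghijklmnopqrstuvwxyz "
C2I = {c: i for i, c in enumerate(CHARS)}
TAGS = ["O", "B-DRUG", "I-DRUG"]  # illustrative tag set
H = 8  # hidden size (assumption)

rng = np.random.default_rng(1)
Wf, Uf = rng.normal(0, 0.1, (H, len(CHARS))), rng.normal(0, 0.1, (H, H))
Wb, Ub = rng.normal(0, 0.1, (H, len(CHARS))), rng.normal(0, 0.1, (H, H))
Wo = rng.normal(0, 0.1, (len(TAGS), 2 * H))  # per-character tag scores

def run(text, W, U, reverse=False):
    """One recurrent pass (simple Elman cell) over the characters."""
    seq = reversed(text) if reverse else text
    h, states = np.zeros(H), []
    for ch in seq:
        x = np.zeros(len(CHARS))
        x[C2I[ch]] = 1.0
        h = np.tanh(W @ x + U @ h)
        states.append(h)
    return states[::-1] if reverse else states

def tag(text):
    """Assign a tag to every character from the bidirectional states."""
    fwd = run(text, Wf, Uf)
    bwd = run(text, Wb, Ub, reverse=True)
    return [TAGS[int(np.argmax(Wo @ np.concatenate([f, b])))]
            for f, b in zip(fwd, bwd)]

labels = tag("ten mg aspirin")
print(len(labels))  # one tag per character: 14
```

Because the model reads raw characters rather than vocabulary indices, unseen drug names and misspellings common in clinical text still produce well-defined inputs, which is the property the abstract highlights.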


Extractive Summarization by Aggregating Multiple Similarities
Olof Mogren | Mikael Kågebäck | Devdatt Dubhashi
Proceedings of the International Conference Recent Advances in Natural Language Processing


Extractive Summarization using Continuous Vector Space Models
Mikael Kågebäck | Olof Mogren | Nina Tahmasebi | Devdatt Dubhashi
Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC)