2019
Question Answering Using Hierarchical Attention on Top of BERT Features
Reham Osama | Nagwa El-Makky | Marwan Torki
Proceedings of the 2nd Workshop on Machine Reading for Question Answering
The submitted model works as follows. Given a question and a passage, it uses BERT embeddings together with a hierarchical attention model, which consists of two parts (co-attention and self-attention), to locate the continuous span of the passage that answers the question.
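The sketch below illustrates the pipeline the abstract describes: BERT features for the question and passage are fused by a co-attention stage, refined by a self-attention stage, and passed to a span-prediction head. It is a minimal sketch assuming PyTorch and the Hugging Face transformers library; the module structure, names (CoAttention, SelfAttention), the multi-head self-attention choice, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of hierarchical attention on top of BERT features for
# extractive QA. Assumes PyTorch and Hugging Face `transformers`;
# module names and hyperparameters are illustrative, not the paper's.
import torch
import torch.nn as nn
from transformers import BertModel


class CoAttention(nn.Module):
    """Attends from each passage token to the question tokens."""
    def __init__(self, hidden):
        super().__init__()
        self.proj = nn.Linear(hidden, hidden)

    def forward(self, passage, question, q_mask):
        # Affinity between every passage token and every question token.
        scores = self.proj(passage) @ question.transpose(1, 2)  # (B, Lp, Lq)
        scores = scores.masked_fill(~q_mask.unsqueeze(1).bool(), -1e9)
        attn = scores.softmax(dim=-1)
        return attn @ question  # question-aware passage features (B, Lp, H)


class SelfAttention(nn.Module):
    """Lets each passage position attend over the whole passage."""
    def __init__(self, hidden, heads=8):
        super().__init__()
        self.mha = nn.MultiheadAttention(hidden, heads, batch_first=True)

    def forward(self, x, p_mask):
        # key_padding_mask is True at positions to ignore (padding).
        out, _ = self.mha(x, x, x, key_padding_mask=~p_mask.bool())
        return out


class HierarchicalAttentionQA(nn.Module):
    def __init__(self, hidden=768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.coattn = CoAttention(hidden)
        self.selfattn = SelfAttention(hidden)
        self.span_head = nn.Linear(hidden, 2)  # start / end logits

    def forward(self, q_ids, q_mask, p_ids, p_mask):
        # BERT features for question and passage.
        q = self.bert(q_ids, attention_mask=q_mask).last_hidden_state
        p = self.bert(p_ids, attention_mask=p_mask).last_hidden_state
        fused = self.coattn(p, q, q_mask)      # co-attention stage
        fused = self.selfattn(fused, p_mask)   # self-attention stage
        start_logits, end_logits = self.span_head(fused).split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)
```

At inference time, the answer span would be recovered from these logits by choosing the (start, end) pair with the highest combined score subject to start ≤ end, which yields the continuous passage span the abstract refers to.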