Red Dragon AI at TextGraphs 2019 Shared Task: Language Model Assisted Explanation Generation

Yew Ken Chia, Sam Witteveen, Martin Andrews


Abstract
The TextGraphs-13 Shared Task on Explanation Regeneration (Jansen and Ustalov, 2019) asked participants to develop methods to reconstruct gold explanations for elementary science questions. Red Dragon AI’s entries used the language of the questions and explanation text directly, rather than constructing a separate graph-like representation. Our leaderboard submission placed us 3rd in the competition, but we present here three methods of increasing sophistication, each of which scored successively higher on the test set after the competition close.
Anthology ID:
D19-5311
Volume:
Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13)
Month:
November
Year:
2019
Address:
Hong Kong
Venues:
EMNLP | TextGraphs | WS
Publisher:
Association for Computational Linguistics
Pages:
85–89
URL:
https://aclanthology.org/D19-5311
DOI:
10.18653/v1/D19-5311
Bibkey:
Cite (ACL):
Yew Ken Chia, Sam Witteveen, and Martin Andrews. 2019. Red Dragon AI at TextGraphs 2019 Shared Task: Language Model Assisted Explanation Generation. In Proceedings of the Thirteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-13), pages 85–89, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Red Dragon AI at TextGraphs 2019 Shared Task: Language Model Assisted Explanation Generation (Chia et al., EMNLP 2019)
PDF:
https://aclanthology.org/D19-5311.pdf
Code:
mdda/worldtree_corpus