%0 Conference Proceedings
%T A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
%A Cohan, Arman
%A Dernoncourt, Franck
%A Kim, Doo Soon
%A Bui, Trung
%A Kim, Seokhwan
%A Chang, Walter
%A Goharian, Nazli
%Y Walker, Marilyn
%Y Ji, Heng
%Y Stent, Amanda
%S Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
%D 2018
%8 June
%I Association for Computational Linguistics
%C New Orleans, Louisiana
%F cohan-etal-2018-discourse
%X Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.
%R 10.18653/v1/N18-2097
%U https://aclanthology.org/N18-2097
%U https://doi.org/10.18653/v1/N18-2097
%P 615-621