%0 Conference Proceedings
%T Improving Neural Abstractive Document Summarization with Structural Regularization
%A Li, Wei
%A Xiao, Xinyan
%A Lyu, Yajuan
%A Wang, Yuanzhuo
%Y Riloff, Ellen
%Y Chiang, David
%Y Hockenmaier, Julia
%Y Tsujii, Jun’ichi
%S Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
%D 2018
%8 oct nov
%I Association for Computational Linguistics
%C Brussels, Belgium
%F li-etal-2018-improving-neural
%X Recent neural sequence-to-sequence models have shown significant progress on short text summarization. However, for document summarization, they fail to capture the long-term structure of both documents and multi-sentence summaries, resulting in information loss and repetitions. In this paper, we propose to leverage the structural information of both documents and multi-sentence summaries to improve the document summarization performance. Specifically, we import both structural-compression and structural-coverage regularization into the summarization process in order to capture the information compression and information coverage properties, which are the two most important structural properties of document summarization. Experimental results demonstrate that the structural regularization improves the document summarization performance significantly, which enables our model to generate more informative and concise summaries, and thus significantly outperforms state-of-the-art neural abstractive methods.
%R 10.18653/v1/D18-1441
%U https://aclanthology.org/D18-1441
%U https://doi.org/10.18653/v1/D18-1441
%P 4078-4087