%0 Conference Proceedings
%T Learning Non-Autoregressive Models from Search for Unsupervised Sentence Summarization
%A Liu, Puyuan
%A Huang, Chenyang
%A Mou, Lili
%Y Muresan, Smaranda
%Y Nakov, Preslav
%Y Villavicencio, Aline
%S Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
%D 2022
%8 May
%I Association for Computational Linguistics
%C Dublin, Ireland
%F liu-etal-2022-learning
%X Text summarization aims to generate a short summary for an input text. In this work, we propose a Non-Autoregressive Unsupervised Summarization (NAUS) approach, which does not require parallel data for training. NAUS first performs edit-based search towards a heuristically defined score, generating a summary as the pseudo-groundtruth. We then train an encoder-only non-autoregressive Transformer on the search results. We also propose a dynamic programming approach for length-control decoding, which is important for the summarization task. Experiments on two datasets show that NAUS achieves state-of-the-art performance for unsupervised summarization while greatly improving inference efficiency. Further, our algorithm can perform explicit length-transfer summary generation.
%R 10.18653/v1/2022.acl-long.545
%U https://aclanthology.org/2022.acl-long.545
%U https://doi.org/10.18653/v1/2022.acl-long.545
%P 7916-7929