RankNAS: Efficient Neural Architecture Search by Pairwise Ranking

Chi Hu, Chenglong Wang, Xiangnan Ma, Xia Meng, Yinqiao Li, Tong Xiao, Jingbo Zhu, Changliang Li


Abstract
This paper addresses the efficiency challenge of Neural Architecture Search (NAS) by formulating the task as a ranking problem. Previous methods require numerous training examples to accurately estimate the performance of architectures, although the actual goal is only to distinguish “good” candidates from “bad” ones. Here we do not resort to performance predictors. Instead, we propose a performance ranking method (RankNAS) based on pairwise ranking, which enables efficient architecture search with far fewer training examples. Moreover, we develop an architecture selection method that prunes the search space and concentrates on the more promising candidates. Extensive experiments on machine translation and language modeling tasks show that RankNAS can design high-performance architectures while being orders of magnitude faster than state-of-the-art NAS systems.
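The pairwise-ranking idea in the abstract can be illustrated with a minimal sketch (not the authors' released code): rather than training a regressor to predict each architecture's exact accuracy, one trains a scorer with a pairwise loss so that it only has to order candidate pairs correctly. The fixed-length architecture feature encoding, the network sizes, the hinge margin, and the names PairwiseRanker and pairwise_hinge_loss below are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class PairwiseRanker(nn.Module):
    """Scores an architecture's feature vector; only the relative order of scores matters."""
    def __init__(self, feat_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

def pairwise_hinge_loss(score_better, score_worse, margin=1.0):
    # Penalize pairs where the architecture known to be better is not
    # scored at least `margin` above the worse one.
    return torch.clamp(margin - (score_better - score_worse), min=0).mean()

# Toy usage: feats_a encodes architectures observed (e.g., via evaluation)
# to outperform those encoded by feats_b.
ranker = PairwiseRanker(feat_dim=16)
feats_a, feats_b = torch.randn(32, 16), torch.randn(32, 16)
loss = pairwise_hinge_loss(ranker(feats_a), ranker(feats_b))
loss.backward()
```

The appeal of this formulation is that each evaluated pair of architectures yields a training signal even when absolute performance estimates are noisy, which is why ranking can get by with far fewer training examples than regression-based performance predictors.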
Anthology ID:
2021.emnlp-main.191
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2469–2480
URL:
https://aclanthology.org/2021.emnlp-main.191
DOI:
10.18653/v1/2021.emnlp-main.191
Cite (ACL):
Chi Hu, Chenglong Wang, Xiangnan Ma, Xia Meng, Yinqiao Li, Tong Xiao, Jingbo Zhu, and Changliang Li. 2021. RankNAS: Efficient Neural Architecture Search by Pairwise Ranking. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2469–2480, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
RankNAS: Efficient Neural Architecture Search by Pairwise Ranking (Hu et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.191.pdf
Data:
WikiText-103, WikiText-2