The intensive resource consumption of evolutionary algorithms makes searching for network architectures very time-consuming. In this paper, we propose an efficient evolutionary method for neural architecture search. Our method adopts a weight-sharing strategy, in which a supernet is built to subsume all candidate architectures, to speed up architecture evaluation. A universal choice strategy is designed to cope with the inaccurate evaluations introduced by such speed-up techniques. Instead of searching for the single best architecture, we search for a set of excellent architectures and derive the target architecture from the commonalities among them. The proposed method achieves better results (a 2.40% test error rate on CIFAR-10 with 3.66M parameters) than other state-of-the-art methods while using less than 0.4 GPU days.
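The derivation step described above — extracting a target architecture from the commonalities of a set of excellent architectures — can be sketched as a per-position majority vote. This is only a minimal illustration; the operation names, the list-of-operations encoding, and the voting rule are assumptions for demonstration, not the paper's actual implementation.

```python
from collections import Counter

# Assumed encoding (illustrative only): each architecture is a list of
# operation choices, one entry per searchable layer of the supernet.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect"]

def derive_target(top_architectures):
    """Derive a single architecture from the commonalities of a set of
    well-performing architectures by majority vote at each position."""
    num_layers = len(top_architectures[0])
    target = []
    for layer in range(num_layers):
        votes = Counter(arch[layer] for arch in top_architectures)
        target.append(votes.most_common(1)[0][0])  # most frequent op wins
    return target

# Example: three hypothetical high-scoring architectures from the search.
top = [
    ["sep_conv_3x3", "skip_connect", "max_pool_3x3"],
    ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3"],
    ["sep_conv_3x3", "skip_connect", "sep_conv_5x5"],
]
print(derive_target(top))  # ['sep_conv_3x3', 'skip_connect', 'max_pool_3x3']
```

A vote over a well-performing set is one natural way to read "commonalities of these architectures": operations that most good architectures agree on are kept, which also makes the final choice less sensitive to the noisy evaluations that weight sharing introduces.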