JOURNAL ARTICLE

Efficient Progressive Neural Architecture Search

Abstract

This paper addresses the difficult problem of finding an optimal neural architecture for a given image classification task. We propose a method that combines two key advances of the previous state of the art in neural architecture search: the strong sampling efficiency of a search scheme based on sequential model-based optimization (SMBO), and the training efficiency gained by sharing weights among sampled architectures. Sequential search has previously demonstrated its ability to find state-of-the-art neural architectures for image classification. However, its computational cost remains high, and can be prohibitive under modest computational budgets. Equipping SMBO with weight sharing alleviates this problem. On the other hand, progressive search with SMBO is inherently greedy, as it leverages a learned surrogate function to predict the validation error of candidate architectures, and this prediction is used directly to rank them. We propose to attenuate the greediness of the original SMBO method by relaxing the role of the surrogate function so that it predicts an architecture sampling probability instead. Experiments on the CIFAR-10 dataset demonstrate that our method, named Efficient Progressive Neural Architecture Search (EPNAS), improves search efficiency while remaining competitive in the quality of the found architectures.
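The abstract's central modification, turning the surrogate's predicted validation error into a sampling distribution over candidate architectures instead of a greedy ranking, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name, the softmax-with-temperature form, and the sampling-without-replacement loop are all choices made here for clarity.

```python
import math
import random

def sample_architectures(candidates, predicted_errors, k, temperature=1.0, rng=None):
    """Sample k candidate architectures without replacement.

    Instead of greedily keeping the top-k candidates by predicted
    validation error (as in the original SMBO-based progressive search),
    each candidate is drawn with probability proportional to
    softmax(-predicted_error / temperature), so weaker-looking
    candidates retain a nonzero chance of being explored.
    """
    rng = rng or random.Random(0)
    # Lower predicted error -> higher logit -> higher sampling probability.
    logits = [-e / temperature for e in predicted_errors]
    m = max(logits)  # subtract the max for numerical stability
    weights = [math.exp(l - m) for l in logits]

    chosen = []
    pool = list(range(len(candidates)))
    for _ in range(min(k, len(pool))):
        total = sum(weights[i] for i in pool)
        r = rng.random() * total
        acc = 0.0
        for idx, i in enumerate(pool):
            acc += weights[i]
            if acc >= r:
                chosen.append(candidates[pool.pop(idx)])
                break
    return chosen
```

As `temperature` approaches zero the sampling collapses back to the greedy top-k ranking, so the greediness of the original scheme is recovered as a limiting case.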

Keywords:
Computer science, Architecture, Artificial intelligence, Artificial neural network, Machine learning, Pattern recognition, Engineering

Metrics

Cited By: 13
FWCI (Field Weighted Citation Impact): 0.00
Refs: 0

Topics

Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Machine Learning and Data Classification (Physical Sciences → Computer Science → Artificial Intelligence)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

CONFERENCE PAPER

PSP: Progressive Space Pruning for Efficient Graph Neural Architecture Search

Guanghui Zhu, Wenjie Wang, Zhuoer Xu, Feng Cheng, Mengchuan Qiu, Chunfeng Yuan, Yihua Huang

In: 2022 IEEE 38th International Conference on Data Engineering (ICDE), 2022, pp. 2168–2181

CONFERENCE PAPER

Pareto-optimal progressive neural architecture search

Eugenio Lomurno, Stefano Samele, Matteo Matteucci, Danilo Ardagna

In: Proceedings of the Genetic and Evolutionary Computation Conference Companion, 2021, pp. 1726–1734

CONFERENCE PAPER

Neural architecture search using progressive evolution

Nilotpal Sinha, Kuan‐Wen Chen

In: Proceedings of the Genetic and Evolutionary Computation Conference, 2022, pp. 1093–1101