JOURNAL ARTICLE

Gradient-Based Neural Architecture Search: A Comprehensive Evaluation

Sarwat Ali, M. Arif Wani

Year: 2023  Journal: Machine Learning and Knowledge Extraction  Vol: 5 (3)  Pages: 1176-1194  Publisher: Multidisciplinary Digital Publishing Institute

Abstract

One of the challenges in deep learning is discovering the optimal architecture for a specific task, a problem effectively tackled through Neural Architecture Search (NAS). NAS encompasses three prominent approaches—reinforcement learning, evolutionary algorithms, and gradient descent—that have demonstrated noteworthy potential in identifying good candidate architectures. However, approaches based on reinforcement learning and evolutionary algorithms often demand extensive computational resources, requiring hundreds of GPU days or more. We therefore confine this work to gradient-based approaches because of their lower computational demands. Our objective is to identify the best gradient-based NAS method and to pinpoint opportunities for future enhancement. To this end, we provide a comprehensive evaluation of four major gradient-descent-based architecture search methods for discovering the best neural architecture for image classification tasks. An overview of these gradient-based methods, i.e., DARTS, PDARTS, Fair DARTS and Att-DARTS, is presented, followed by a theoretical comparison based on their search spaces, continuous relaxation strategies and bi-level optimization for deriving the best neural architecture. The strengths and weaknesses of each method are also listed. Experimental results comparing the error rates and computational costs of these gradient-based methods are analyzed, using the benchmark datasets CIFAR-10, CIFAR-100 and ImageNet. The results show that PDARTS achieves both lower error rates and lower search cost than the other examined methods, making it a potent candidate for automating Neural Architecture Search. Through this comparative analysis, our research provides valuable insights and future research directions to address criticisms and gaps in the literature.
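The continuous relaxation the abstract refers to can be illustrated with a toy sketch. In DARTS-style search, each edge of the cell mixes all candidate operations with softmax weights over learned architecture parameters α, so the discrete choice of operation becomes differentiable. The operation set and α values below are invented for illustration and are not taken from the paper:

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(a - a.max())
    return e / e.sum()

# Toy candidate operations on one edge of the cell
# (stand-ins for skip-connect, zero, and a learned transform).
ops = [
    lambda x: x,        # identity / skip-connect
    lambda x: 0.0 * x,  # zero operation
    lambda x: 2.0 * x,  # stand-in for a convolution-like transform
]

# Architecture parameters alpha (one scalar per candidate op);
# in DARTS these are updated by gradient descent on validation loss.
alpha = np.array([1.0, 0.5, 2.0])

def mixed_op(x, alpha, ops):
    """Continuous relaxation: softmax-weighted sum of all candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = np.array([1.0, -1.0])
y = mixed_op(x, alpha, ops)

# After search, the edge is discretized by keeping the op with the
# largest architecture weight (here the third one).
best = int(np.argmax(alpha))
```

The bi-level optimization the abstract mentions alternates between updating the network weights on training data (inner level) and updating α on validation data (outer level); the sketch above only shows the forward pass of the relaxed edge.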

Keywords:
Computer science; Reinforcement learning; Artificial intelligence; Artificial neural network; Architecture; Gradient descent; Machine learning; Deep learning

Metrics

Cited By: 7
FWCI (Field-Weighted Citation Impact): 1.27
References: 24
Citation Normalized Percentile: 0.77


Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Machine Learning and Data Classification
Physical Sciences →  Computer Science →  Artificial Intelligence
Reinforcement Learning in Robotics
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

BOOK-CHAPTER

Spatial Steganalysis Based on Gradient-Based Neural Architecture Search

Xiaoqing Deng, Weiqi Luo, Yanmei Fang

Lecture Notes in Computer Science  Year: 2021  Pages: 365-375
JOURNAL ARTICLE

A Gradient-Guided Evolutionary Neural Architecture Search

Yu Xue, Xiaolong Han, Ferrante Neri, Jiafeng Qin, Danilo Pelusi

Journal: IEEE Transactions on Neural Networks and Learning Systems  Year: 2024  Vol: 36 (3)  Pages: 4345-4357