JOURNAL ARTICLE

Neural architecture search using property guided synthesis

Charles Jin, Phitchaya Mangpo Phothilimthana, Sudip Roy

Year: 2022 Journal: Proceedings of the ACM on Programming Languages Vol: 6 (OOPSLA2) Pages: 1150-1179 Publisher: Association for Computing Machinery

Abstract

Neural architecture search (NAS) has become an increasingly important tool within the deep learning community in recent years, yielding many practical advancements in the design of deep neural network architectures. However, most existing approaches operate within highly structured design spaces, and hence (1) explore only a small fraction of the full search space of neural architectures while also (2) requiring significant manual effort from domain experts. In this work, we develop techniques that enable efficient NAS in a significantly larger design space. In particular, we propose to perform NAS in an abstract search space of program properties. Our key insights are as follows: (1) an abstract search space can be significantly smaller than the original search space, and (2) architectures with similar program properties should also have similar performance; thus, we can search more efficiently in the abstract search space. To enable this approach, we also introduce a novel efficient synthesis procedure, which performs the role of concretizing a set of promising program properties into a satisfying neural architecture. We implement our approach, αNAS, within an evolutionary framework, where the mutations are guided by the program properties. Starting with a ResNet-34 model, αNAS produces a model with slightly improved accuracy on CIFAR-10 but 96% fewer parameters. On ImageNet, αNAS is able to improve over Vision Transformer (30% fewer FLOPS and parameters), ResNet-50 (23% fewer FLOPS, 14% fewer parameters), and EfficientNet (7% fewer FLOPS and parameters) without any degradation in accuracy.
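To make the abstract's core idea concrete, here is a minimal, hypothetical sketch of property-guided evolutionary search: architectures are abstracted into a small vector of program properties, mutations are applied in that abstract space, and a synthesis step concretizes the mutated properties back into a satisfying architecture. All names, the two toy properties (depth and width), and the parameter-count fitness proxy are invented for illustration; they are not the paper's actual property language or synthesis procedure.

```python
import random

def properties(arch):
    """Abstract a concrete architecture (a list of layer widths)
    into a small property vector: (depth, max width)."""
    return (len(arch), max(arch))

def synthesize(props):
    """Concretize a property vector into one architecture that
    satisfies it (stand-in for the paper's synthesis procedure)."""
    depth, width = props
    return [width] * depth

def mutate(props, rng):
    """Mutate in the abstract property space, not the concrete one."""
    depth, width = props
    if rng.random() < 0.5:
        depth = max(1, depth + rng.choice([-1, 1]))
    else:
        width = max(8, width + rng.choice([-8, 8]))
    return (depth, width)

def fitness(arch):
    """Toy proxy: fewer parameters is better (a real system would
    evaluate trained accuracy as well)."""
    return -sum(w * w for w in arch)

def evolve(seed_arch, steps=50, rng=None):
    """Greedy evolutionary loop: mutate properties, synthesize a
    candidate, keep it only if fitness improves."""
    rng = rng or random.Random(0)
    best_props = properties(seed_arch)
    best = synthesize(best_props)
    for _ in range(steps):
        candidate = synthesize(mutate(best_props, rng))
        if fitness(candidate) > fitness(best):
            best, best_props = candidate, properties(candidate)
    return best
```

Because acceptance is improvement-only, the returned architecture is never worse than the seed under the toy fitness; the point of the sketch is that the search loop never enumerates concrete architectures directly, only property vectors.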

Keywords:
FLOPS, Computer science, Architecture, Artificial neural network, Artificial intelligence, Machine learning, Computer engineering, Parallel computing

Metrics

Cited By: 7
FWCI (Field Weighted Citation Impact): 0.87
Refs: 47
Citation Normalized Percentile: 0.71

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Adversarial Robustness in Machine Learning
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

EQUINAS: Equilibrium-guided differentiable neural architecture search

Weisheng Xie, Xiangxiang Gao, Xuwei Fang, Hui Li, Chen Hang, Shaoyuan Li

Journal: Expert Systems with Applications Year: 2025 Vol: 298 Pages: 129711
JOURNAL ARTICLE

Efficient guided evolution for neural architecture search

Vasco Lopes, Miguel Santos, Bruno Degardin, Luís A. Alexandre

Journal: Proceedings of the Genetic and Evolutionary Computation Conference Companion Year: 2022 Pages: 655-658
JOURNAL ARTICLE

A Gradient-Guided Evolutionary Neural Architecture Search

Yu Xue, Xiaolong Han, Ferrante Neri, Jiafeng Qin, Danilo Pelusi

Journal: IEEE Transactions on Neural Networks and Learning Systems Year: 2024 Vol: 36 (3) Pages: 4345-4357
JOURNAL ARTICLE

NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search

Chen Wei, Chuang Niu, Yiping Tang, Yue Wang, Haihong Hu, Jimin Liang

Journal: IEEE Transactions on Neural Networks and Learning Systems Year: 2022 Vol: 34 (11) Pages: 8441-8455