Neural Architecture Search (NAS) has revolutionized network design by automating the search for optimal architectures, albeit often at substantial computational cost. This paper introduces an approach that mitigates this cost by strategically shuffling channel depth and masking less critical channels during the search process. The discovered backbone is then migrated to an object detection network and fine-tuned, circumventing the need for training from scratch. This method achieves competitive results in mean Average Precision (mAP), parameter count, and Floating Point Operations (FLOPs), on par with conventional hand-designed networks and existing NAS networks.
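The two core operations the abstract describes — shuffling channels across groups and masking less critical channels — can be illustrated with a minimal NumPy sketch. The function names, the group-wise shuffle, and the L1-norm importance criterion below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def channel_shuffle(x, groups):
    """ShuffleNet-style channel shuffle on an (N, C, H, W) tensor:
    split channels into `groups`, then interleave them across groups.
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)   # swap group and per-group axes
             .reshape(n, c, h, w))

def mask_least_important_channels(weights, keep_ratio=0.5):
    """Zero out the least important output channels of a conv layer.

    weights: (out_channels, in_channels, kH, kW).
    Importance is taken as the L1 norm of each output filter
    (an assumption; the paper may use a different criterion).
    Returns the masked weights and the boolean keep-mask.
    """
    importance = np.abs(weights).sum(axis=(1, 2, 3))
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    keep = np.argsort(importance)[-n_keep:]   # indices of the strongest channels
    mask = np.zeros(weights.shape[0], dtype=bool)
    mask[keep] = True
    return weights * mask[:, None, None, None], mask

# Example: shuffle a 4-channel map, then mask half of 8 conv filters.
x = np.arange(4).reshape(1, 4, 1, 1).astype(float)
shuffled = channel_shuffle(x, groups=2)

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
masked_w, mask = mask_least_important_channels(w, keep_ratio=0.5)
```

During search, masked channels contribute nothing to the forward pass, so the effective width of each layer shrinks without rebuilding the network.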
Zhenshan Bao, Qian Zhao, Wenbo Zhang, Yilong Ding