
Abstract

Conventional neural architecture search (NAS) algorithms typically operate on search spaces with short-distance node connections. We argue that such designs, though safe and stable, are obstacles to exploring more effective network architectures. In this brief, we study search algorithms on a complicated search space with long-distance connections and show that existing weight-sharing search algorithms fail due to the existence of interleaved connections (ICs). Based on this observation, we present a simple yet effective algorithm, termed interleaving-free neural architecture search (IF-NAS). We further design a periodic sampling strategy to construct subnetworks during the search procedure, preventing ICs from emerging in any of them. In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by significant margins. It also generalizes well to microcell-based spaces. This study emphasizes the importance of the macrostructure, and we look forward to further efforts in this direction. The code is available at github.com/sunsmarterjie/IFNAS.
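The record contains only the abstract, so the following is a minimal, illustrative Python sketch of what a periodic, interleaving-free sampling schedule could look like. It assumes ICs are crossing edge pairs (a→c and b→d with a < b < c < d) over a DAG-shaped search space, and that "periodic sampling" cycles through edge groups that each contain no such pair; the function names, the crossing-edge definition, and the greedy grouping are assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of IF-NAS-style periodic sampling; the crossing-edge
# definition of "interleaved connections" and the greedy grouping below are
# illustrative assumptions based on the abstract, not the released code.

from itertools import combinations

def interleaved(e1, e2):
    """Two edges (a, c) and (b, d) interleave if a < b < c < d (they cross)."""
    (a, c), (b, d) = sorted([e1, e2])
    return a < b < c < d

def interleaving_free_groups(edges):
    """Greedily partition candidate edges into groups with no interleaved
    pair. Each group induces one subnetwork; cycling over the groups is the
    'periodic' part of the sampling schedule."""
    groups = []
    for e in sorted(edges):
        for g in groups:
            if all(not interleaved(e, f) for f in g):
                g.append(e)
                break
        else:
            groups.append([e])
    return groups

# Toy search space: nodes 0..4 with short- and long-distance connections.
edges = [(0, 1), (1, 2), (2, 3), (3, 4),   # short-distance
         (0, 3), (1, 4), (0, 4)]           # long-distance
groups = interleaving_free_groups(edges)

# Periodic sampling: at search step t, train only the subnetwork induced by
# groups[t % len(groups)], so no sampled subnetwork ever contains an
# interleaved (crossing) pair of connections.
for t in range(6):
    active = groups[t % len(groups)]
    assert all(not interleaved(e, f) for e, f in combinations(active, 2))
    print(f"step {t}: active edges {active}")
```

Under these assumptions, every sampled subnetwork is interleaving-free by construction rather than by rejection sampling, which is presumably why such a schedule can keep weight-sharing optimization stable while still covering long-distance connections over the full period.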

Details

Journal classification
1007527
Title
Exploring Complicated Search Spaces With Interleaving-Free Sampling
Journal abbreviation
IEEE Trans Neural Netw Learn Syst
Volume
36
Issue
4
Pages
7764-7771
Publication year
2025
Country of publication
UNITED STATES
eISSN
2162-2388
Source type
Scholarly Journal
Format availability
Internet
Language of publication
English
Record type
Journal Article
Publication history
Online publication date
2025-04-04
Publication note
Print-Electronic
   First posting date
04 Apr 2025
   Revised date
07 Apr 2025
   First submitted date
18 Jul 2024
Medline document status
PubMed-not-MEDLINE
Electronic publication date
2025-04-04
PubMed ID
39024083
ProQuest document ID
3082625223
Document URL
https://www.proquest.com/scholarly-journals/exploring-complicated-search-spaces-with/docview/3082625223/se-2?accountid=208611
Last updated
2025-04-07
Database
ProQuest One Academic