Enhanced Gradient for Differentiable Architecture Search.

Author: Zhang H, Hao K, Gao L, Tang XS, Wei B
Language: English
Source: IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2024 Jul; Vol. 35 (7), pp. 9606-9620. Date of Electronic Publication: 2024 Jul 08.
DOI: 10.1109/TNNLS.2023.3235479
Abstract: In recent years, neural architecture search (NAS) methods have been proposed for the automatic generation of task-oriented network architectures in image classification. However, the architectures obtained by existing NAS approaches are optimized only for classification performance and do not adapt to devices with limited computational resources. To address this challenge, we propose a neural network architecture search algorithm that aims to simultaneously improve network performance and reduce network complexity. The proposed framework automatically builds the network architecture in two stages: block-level search and network-level search. At the block-level search stage, a gradient-based relaxation method is proposed that uses an enhanced gradient to design high-performance, low-complexity blocks. At the network-level search stage, an evolutionary multiobjective algorithm is used to complete the automatic design from blocks to the target network. The experimental results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification, with an error rate of 3.18% on the Canadian Institute for Advanced Research (CIFAR10) dataset and 19.16% on CIFAR100, both with fewer than 1 M network parameters. Compared with other NAS methods, our method achieves a substantial reduction in the number of parameters in the designed network architecture.
Database: MEDLINE
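
The block-level stage described in the abstract builds on a gradient-based continuous relaxation of the architecture choice. Below is a minimal sketch of such a relaxation in the generic DARTS style, assuming a PyTorch implementation; the candidate operation set, the MixedOp class, and the expected-parameter-count penalty are illustrative assumptions and do not reproduce the authors' enhanced gradient.

# Minimal sketch of a DARTS-style continuous relaxation (assumed PyTorch);
# the op set and the complexity penalty are illustrative, not the paper's
# "enhanced gradient".
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One block edge as a softmax-weighted mixture of candidate ops."""
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.AvgPool2d(3, stride=1, padding=1),                     # average pool
            nn.Identity(),                                            # skip connection
        ])
        # Architecture parameters (one logit per candidate op), trained by
        # gradient descent alongside the ordinary network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax relaxation: the edge output is a convex combination of all
        # candidate ops, so the discrete choice becomes differentiable.
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

    def expected_params(self) -> torch.Tensor:
        # Differentiable complexity proxy: each op's parameter count,
        # weighted by its current softmax probability.
        w = F.softmax(self.alpha, dim=0)
        counts = torch.tensor(
            [sum(p.numel() for p in op.parameters()) for op in self.ops],
            dtype=w.dtype, device=w.device)
        return (w * counts).sum()

# Usage sketch: a bi-objective loss trading accuracy against complexity.
edge = MixedOp(channels=16)
x = torch.randn(2, 16, 32, 32)
task_loss = edge(x).pow(2).mean()              # placeholder task loss
loss = task_loss + 1e-6 * edge.expected_params()
loss.backward()

The expected-parameter-count term gives a differentiable proxy for network complexity, which matches the abstract's goal of designing high-performance, low-complexity blocks within a single gradient-based search.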
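
For the network-level stage, the abstract names an evolutionary multiobjective algorithm that assembles blocks into the target network. The following is a minimal sketch of the Pareto-dominance comparison such algorithms rely on, with hypothetical (error rate, parameter count) objective pairs; the abstract does not specify the evolutionary operators the authors use.

# Hedged sketch: Pareto-dominance filtering over candidate networks scored
# on two objectives (error rate, parameter count); both are minimized.
# The candidate tuples below are hypothetical.
from typing import List, Tuple

Objectives = Tuple[float, float]  # (error_rate, param_count)

def dominates(a: Objectives, b: Objectives) -> bool:
    """a dominates b if it is no worse on every objective and better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population: List[Objectives]) -> List[Objectives]:
    """Keep the candidates not dominated by any other candidate."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Hypothetical candidates: (CIFAR error %, parameters in millions).
candidates = [(3.18, 0.9), (3.5, 0.4), (2.9, 3.2), (4.0, 1.5)]
print(pareto_front(candidates))  # (4.0, 1.5) is dominated by (3.5, 0.4)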