Abstract: |
In differentiable architecture search methods, a more efficient search space design can significantly improve the performance of the searched architecture, which requires the search space to be carefully defined with varying complexity according to the different operations. Meanwhile, rationalizing the search strategies used to explore a well-defined search space further improves the speed and efficiency of architecture search. With this in mind, we propose a faster and more efficient differentiable architecture search method, AllegroNAS. Firstly, we introduce a more efficient search space, enriched by two redefined convolution modules. Secondly, we utilize a more efficient architectural parameter regularization method, mitigating the overfitting problem during the search process and reducing the error introduced by gradient approximation. We also introduce a natural exponential cosine annealing method to make the learning rate of the neural network training process more suitable for the search procedure. Moreover, group convolution and data augmentation are employed to reduce the computational cost. Finally, through extensive experiments on several public datasets, we demonstrate that our method can more swiftly search for better-performing neural network architectures in a more efficient search space, validating the effectiveness of our approach.