Abstract: |
Constructing a deep neural network for a particular task often requires substantial architecture engineering and access to large-scale datasets. Transfer learning, which reuses architectures pre-trained on large-scale datasets, has proven to be an effective technique for addressing both challenges. However, using a pre-trained model requires adapting it to the target task, typically by freezing or fine-tuning specific layers. In this paper, we introduce a novel optimization model for selecting which layers to freeze and which to fine-tune, aiming to improve the efficacy of transfer learning: the contribution of each source layer to the target network is controlled by a binary decision variable. The resulting optimization problem is solved with a genetic algorithm. Several experiments confirm that our model identifies effective combinations of frozen and fine-tuned layers, thereby improving data classification. Finally, we compare the results with those of state-of-the-art transfer learning methods.
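The layer-selection idea in the abstract can be sketched as a genetic algorithm over binary masks, where bit *i* decides whether source layer *i* is fine-tuned (1) or frozen (0). The sketch below is illustrative only: the fitness function, population parameters, and truncation selection are assumptions, and in the paper's setting `fitness` would stand for the measured target-task performance after training with the given mask.

```python
import random

random.seed(0)

N_LAYERS = 8       # number of source layers (illustrative)
POP_SIZE = 20
GENERATIONS = 30
MUTATION_RATE = 0.1

def fitness(mask):
    # Hypothetical stand-in for validation accuracy: in practice this would
    # mean fine-tuning the layers where mask[i] == 1, freezing the rest,
    # and evaluating on the target task. This toy score simply prefers
    # fine-tuning later (more task-specific) layers.
    return sum(bit * (i + 1) for i, bit in enumerate(mask))

def crossover(a, b):
    # Single-point crossover between two parent masks.
    point = random.randrange(1, N_LAYERS)
    return a[:point] + b[point:]

def mutate(mask):
    # Flip each bit independently with probability MUTATION_RATE.
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in mask]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_LAYERS)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]  # truncation selection (an assumption)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # 1 = fine-tune that layer, 0 = freeze it
```

In a real pipeline the expensive step is evaluating `fitness`, since each candidate mask implies a partial fine-tuning run; the GA's role is to explore the 2^N mask space with far fewer evaluations than exhaustive search.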