Author: Yoshihide Sawada, Yoshikuni Sato, Toru Nakada, Shunta Yamaguchi, Kei Ujimoto, Nobuhiro Hayashi
Language: English
Year of publication: 2019
Subject:
Source: Applied Sciences, Vol 9, Iss 1, p 128 (2019)
Document type: article
ISSN: 2076-3417
DOI: 10.3390/app9010128
Description:
This paper proposes a target vector modification method for the all-transfer deep learning (ATDL) method. Deep neural networks (DNNs) have been used widely in many applications; however, DNNs are known to perform poorly when large amounts of training data are not available. Transfer learning can provide a solution to this problem. Previous methods regularize all layers, including the output layer, by estimating relation vectors, which are used in place of the one-hot target vectors of the target domain. These vectors are estimated by averaging the output vectors of the target domain data for each target domain label. This improves classification performance, but it does not consider the relations among the relation vectors themselves. From this point of view, we propose a relation vector modification based on constrained pairwise repulsive forces. High pairwise repulsive forces increase the distances between the relation vectors, while a constraint based on the distributions of the output vectors of the target domain data mitigates the risk of divergence. We apply our method to two simulation experiments and to disease classification using two-dimensional electrophoresis images. The experimental results show that reusing all layers through our estimation method is effective, especially when the number of target domain samples is very small.
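The abstract describes the method only at a high level. The sketch below illustrates one plausible reading of it, assuming the relation vectors are the per-label means of the network's output vectors on the target domain, the repulsion is a simple inverse-square pairwise force, and the constraint is a per-class trust region derived from the spread of those output vectors. The function names, the force law, and the step/iteration parameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def estimate_relation_vectors(outputs, labels):
    """Relation vector per label: the mean of the source-trained network's
    output vectors over the target domain samples with that label."""
    return {c: outputs[labels == c].mean(axis=0) for c in np.unique(labels)}

def modify_relation_vectors(relation, outputs, labels, step=0.1, n_iter=100):
    """Push the relation vectors apart with pairwise repulsive forces,
    constrained so each vector stays within the spread of the output
    vectors of its own label (a simple guard against divergence)."""
    classes = sorted(relation)
    vecs = {c: relation[c].copy() for c in classes}
    # Per-class spread of the outputs, used as a crude trust region.
    spread = {c: outputs[labels == c].std(axis=0).mean() for c in classes}
    for _ in range(n_iter):
        for c in classes:
            force = np.zeros_like(vecs[c])
            for d in classes:
                if d == c:
                    continue
                diff = vecs[c] - vecs[d]
                dist = np.linalg.norm(diff) + 1e-8
                force += diff / dist**3  # inverse-square repulsion
            candidate = vecs[c] + step * force
            # Constraint: reject moves that leave the per-class trust region.
            if np.linalg.norm(candidate - relation[c]) <= spread[c]:
                vecs[c] = candidate
    return vecs
```

In use, `outputs` would be the output-layer activations of the source-trained network on the target domain data and `labels` the corresponding target labels; the modified vectors would then replace the one-hot targets when fine-tuning all layers, as the abstract describes for ATDL.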
Database: Directory of Open Access Journals
External link: