Transfer learning with deep convolutional neural network for constitution classification with face image
Author: | Guihua Wen, Er-Yang Huan |
Year of publication: | 2020 |
Subject: |
Contextual image classification, Computer Networks and Communications, Computer science, Constitution, Software engineering, Machine learning, Convolutional neural network, Field (computer science), Identification (information), Hardware and Architecture, Face (geometry), Electrical/electronic/information engineering, Media Technology, Artificial intelligence, Transfer of learning, Constitution type, Software |
Source: | Multimedia Tools and Applications. 79:11905-11919 |
ISSN: | 1573-7721, 1380-7501 |
DOI: | 10.1007/s11042-019-08376-5 |
Description: | Constitution classification is the basis and core content of constitution research in Traditional Chinese Medicine. Convolutional neural networks have produced many successful image-classification models, but they require large amounts of training data, and in the field of Traditional Chinese Medicine the available clinical data are very limited. To address this problem, we propose a constitution-classification method based on transfer learning. First, the DenseNet-169 model pretrained on ImageNet is adopted. Second, we modify the DenseNet-169 structure according to the characteristics of constitution classification, and the modified model is then trained on the clinical data to obtain a constitution-identification network called ConstitutionNet. To further improve classification accuracy, we follow the idea of ensemble learning and combine ConstitutionNet with VGG-16, Inception v3 and DenseNet-121 at test time to assign each input face image to its constitution type. The experimental results show that transfer learning achieves good results on a small clinical dataset, with a final constitution-recognition accuracy of 66.79%. (A hedged code sketch of this pipeline follows the record below.) |
Database: | OpenAIRE |
External link: |
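
Below is a minimal PyTorch sketch of the pipeline the abstract describes: load DenseNet-169 pretrained on ImageNet, replace its classifier head for the constitution classes, fine-tune on clinical data, and average the softmax outputs of several models at test time. This is an illustration under assumptions, not the authors' code: the number of constitution types (NUM_CLASSES = 9, the common TCM taxonomy), the data loading, and all hyperparameters are hypothetical, and the paper's actual structural modification of DenseNet-169 is more involved than a plain head swap.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 9  # assumption: the nine-type TCM constitution taxonomy


def build_constitution_net():
    """Transfer-learning step: DenseNet-169 pretrained on ImageNet,
    with the ImageNet head replaced by a constitution classifier."""
    net = models.densenet169(weights=models.DenseNet169_Weights.IMAGENET1K_V1)
    net.classifier = nn.Linear(net.classifier.in_features, NUM_CLASSES)
    return net


def finetune(net, loader, epochs=10, lr=1e-4):
    """Fine-tune the whole network on the small clinical face dataset
    (hypothetical loader yielding (image_batch, label_batch))."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    net.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss_fn(net(images), labels).backward()
            opt.step()
    return net


def ensemble_predict(nets, images):
    """Ensemble step: average the softmax outputs of the models and
    take the argmax as the constitution type. Assumes `images` is
    already preprocessed for each model (in practice Inception v3
    expects 299x299 input while the others use 224x224)."""
    for net in nets:
        net.eval()
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(net(images), dim=1) for net in nets]
        ).mean(dim=0)
    return probs.argmax(dim=1)
```

The other ensemble members would be built and fine-tuned the same way, replacing their own heads (the last linear layer of `models.vgg16().classifier`, `models.inception_v3().fc`, and `models.densenet121().classifier` respectively).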