Discover the Effective Strategy for Face Recognition Model Compression by Improved Knowledge Distillation
Author: Rujie Liu, Shigefumi Yamada, Narishige Abe, Mengjiao Wang, Hidetsugu Uchida, Tomoaki Matsunami
Year of publication: 2018
Subject: Normalization (statistics); Computer science; Cosine similarity; Machine learning; Facial recognition system; Weighting; Metric (mathematics); Feature (machine learning); Artificial intelligence; Layer (object-oriented design)
Source: ICIP (IEEE International Conference on Image Processing)
DOI: 10.1109/icip.2018.8451808
Description: For the sake of better accuracy, face recognition models have grown ever larger, which makes them difficult to deploy on embedded systems. This work proposes an effective model compression method based on knowledge distillation, in which a fast student model is trained under the guidance of a complex teacher model. First, different loss combinations and network architectures are analyzed through comprehensive experiments to find the most effective approach. To further improve performance, the feature layer is normalized so that the optimization objective is consistent with the cosine similarity metric. Moreover, a teacher weighting strategy is proposed to handle cases where the teacher provides incorrect guidance. Experimental results show that the student model built by this approach can surpass the teacher model while achieving a 3× speedup. (An illustrative sketch of this recipe follows the record below.)
Database: OpenAIRE
External link:
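The abstract combines three ingredients: a standard classification loss on the student, a feature-mimicking distillation loss computed on L2-normalized embeddings (so the objective matches the cosine similarity used at verification time), and a per-sample teacher weight that suppresses distillation when the teacher is wrong. The following is a minimal PyTorch-style sketch, not the authors' code: the function and tensor names are illustrative, and the specific weighting form (a hard 0/1 mask on teacher correctness) and loss combination are assumptions, since the paper evaluates several variants.

```python
import torch
import torch.nn.functional as F

def distill_loss(student_feat, student_logits,
                 teacher_feat, teacher_logits,
                 labels, alpha=0.5):
    """Hypothetical combined loss: classification + weighted feature distillation."""
    # L2-normalize both embeddings so the mimicking objective is expressed
    # in terms of cosine similarity, matching the verification-time metric.
    s = F.normalize(student_feat, dim=1)
    t = F.normalize(teacher_feat, dim=1)

    # Teacher weighting (assumed form): trust the distillation signal
    # only on samples the teacher classifies correctly.
    with torch.no_grad():
        w = (teacher_logits.argmax(dim=1) == labels).float()

    # Feature-mimicking term: per-sample (1 - cosine similarity),
    # masked by the teacher weight.
    mimic = (w * (1.0 - (s * t).sum(dim=1))).mean()

    # Standard softmax classification loss on the student's logits.
    cls = F.cross_entropy(student_logits, labels)

    # alpha balances classification against distillation; its value
    # here is arbitrary, not taken from the paper.
    return cls + alpha * mimic
```

Normalizing the features before matching is the key alignment step: without it, the student could minimize the Euclidean mimicking loss while still drifting in the angular directions that the cosine-similarity metric actually scores.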