Knowledge Distillation Based on Fitting Ground-Truth Distribution of Images

Authors: Jianze Li, Zhenhua Tang, Kai Chen, Zhenlei Cui
Language: English
Publication Year: 2024
Source: Applied Sciences, Vol 14, Iss 8, p 3284 (2024)
Document Type: article
ISSN: 2076-3417
DOI: 10.3390/app14083284
Description: Knowledge distillation based on the features from the penultimate layer allows the student (a lightweight model) to efficiently mimic the internal feature outputs of the teacher (a high-capacity model). However, the training data may not conform to the ground-truth distribution of images in terms of classes and features. We propose two knowledge distillation algorithms that address this problem from two directions: fitting the ground-truth distribution of classes and fitting the ground-truth distribution of features. The former uses teacher labels, rather than dataset labels, to supervise the student's classification output, while the latter introduces feature temperature parameters to correct the teacher's abnormal feature distribution output. We conducted knowledge distillation experiments on the ImageNet-2012 and CIFAR-100 datasets using seven sets of homogeneous models and six sets of heterogeneous models. The experimental results show that our proposed algorithms improve the performance of penultimate-layer feature knowledge distillation and outperform other existing knowledge distillation methods in terms of classification performance and generalization ability.
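The first direction described above — supervising the student with the teacher's softened output distribution instead of the dataset labels — builds on the standard distillation recipe. A minimal sketch in plain Python of that baseline mechanism (the temperature value and the toy logits are illustrative assumptions, not values from the paper):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature flattens
    # the distribution, exposing the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q) between two discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # The teacher's softened class distribution, not the dataset label,
    # is the supervision target for the student; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return (temperature ** 2) * kl_divergence(teacher_probs, student_probs)

# Toy three-class example (hypothetical logits).
teacher_logits = [2.0, 1.0, 0.1]
student_logits = [1.5, 1.2, 0.3]
loss = distillation_loss(student_logits, teacher_logits)
```

The paper's second direction, the feature temperature parameter, analogously rescales the penultimate-layer feature distribution before matching; the exact correction scheme is specified in the full article.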
Database: Directory of Open Access Journals