Showing 1 - 3 of 3
for search: '"Wen, Shengzhao"'
Knowledge distillation is an effective method for model compression. However, applying knowledge distillation to detection tasks remains challenging. Two key issues lead to poor distillation performance on detection tasks.
External link:
http://arxiv.org/abs/2302.05637
Author:
Wang, Yu, Li, Xin, Wen, Shengzhao, Yang, Fukui, Zhang, Wanping, Zhang, Gang, Feng, Haocheng, Han, Junyu, Ding, Errui
DETR is a novel end-to-end transformer-based object detector, which significantly outperforms classic detectors when scaling up the model size. In this paper, we focus on the compression of DETR with knowledge distillation. While knowledge dis…
External link:
http://arxiv.org/abs/2211.08071
Published in:
CVPR
Neural architecture search (NAS) advances beyond the state of the art in various computer vision tasks by automating the design of deep neural networks. In this paper, we aim to address three important questions in NAS: (1) How to measure the correl…