Showing 1 - 5 of 5 for search: '"Gurbuz, Yeti Z."'
A typical technique in knowledge distillation (KD) is regularizing the learning of a limited-capacity model (the student) by pushing its responses to match those of a powerful model (the teacher). Albeit useful especially in the penultimate layer and beyond, … (a sketch of the matching loss follows the link below)
External link:
http://arxiv.org/abs/2309.02843
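As a rough illustration of the response-matching regularizer this abstract describes, here is a minimal sketch of a standard temperature-scaled distillation loss in PyTorch. The function name and temperature value are illustrative assumptions, not the paper's method.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Soft-target matching: KL divergence between temperature-scaled
    teacher and student distributions (a generic KD sketch, not the
    paper's proposed layer)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature**2

# Typical usage: total = cross_entropy(student_logits, labels) + alpha * kd_loss(...)
```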
A common architectural choice for deep metric learning is a convolutional neural network followed by global average pooling (GAP). Albeit simple, GAP is a highly effective way to aggregate information. One possible explanation for the effectiveness … (a sketch of this architecture follows the link below)
External link:
http://arxiv.org/abs/2308.09228
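A minimal sketch of the architecture the abstract names (CNN backbone followed by GAP), assuming the backbone returns a 4-D (B, C, H, W) feature map; `GAPEmbedder` is a hypothetical name, not a class from the paper.

```python
import torch.nn as nn

class GAPEmbedder(nn.Module):
    """CNN backbone followed by global average pooling (GAP): a
    (B, C, H, W) feature map is averaged over H and W into a (B, C)
    embedding, as in the common deep metric learning setup."""
    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone

    def forward(self, x):
        fmap = self.backbone(x)       # (B, C, H, W)
        return fmap.mean(dim=(2, 3))  # (B, C) -- GAP
```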
Author:
Gurbuz, Yeti Z., Alatan, A. Aydin
Global average pooling (GAP) is a popular component in deep metric learning (DML) for aggregating features. Its effectiveness is often attributed to treating each feature vector as a distinct semantic entity and GAP as a combination of them. … (a sketch of this view follows the link below)
External link:
http://arxiv.org/abs/2307.07620
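To make the "combination of semantic entities" reading concrete, the small sketch below checks that GAP equals a uniform convex combination of the H*W feature vectors. This restates a standard identity to illustrate the abstract's framing; it is not the paper's contribution.

```python
import torch

def gap_as_combination(fmap):
    """View a (B, C, H, W) map as H*W C-dimensional 'semantic entities'
    and GAP as their uniform convex combination with weights 1/(H*W)."""
    b, c, h, w = fmap.shape
    entities = fmap.flatten(2).transpose(1, 2)  # (B, H*W, C)
    weights = torch.full((b, h * w, 1), 1.0 / (h * w),
                         device=fmap.device, dtype=fmap.dtype)
    combined = (weights * entities).sum(dim=1)  # (B, C)
    # Sanity check: identical to plain GAP over the spatial dimensions.
    assert torch.allclose(combined, fmap.mean(dim=(2, 3)), atol=1e-6)
    return combined
```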
Convolution blocks serve as local feature extractors and are key to the success of neural networks. To make local semantic feature embedding rather explicit, we reformulate convolution blocks as feature selection according to the best-matching kernel … (an illustrative sketch follows the link below)
External link:
http://arxiv.org/abs/2210.00992
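One possible, hedged reading of "feature selection according to the best-matching kernel" is a winner-take-all over kernel responses at each spatial location. The sketch below implements that reading as an assumption; it is not the paper's exact block.

```python
import torch.nn.functional as F

def best_match_select(x, kernels, bias=None):
    """Compute all kernel responses, then keep, per spatial location,
    only the response (and index) of the best-matching kernel.
    `kernels` has shape (K, C_in, kh, kw); stride-1, 'same' padding."""
    responses = F.conv2d(x, kernels, bias=bias, padding="same")  # (B, K, H, W)
    best, idx = responses.max(dim=1, keepdim=True)               # (B, 1, H, W)
    return best, idx  # idx records which kernel was selected where
```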
Deep metric learning (DML) aims to minimize the empirical expected loss of pairwise intra-/inter-class proximity violations in the embedding space. We relate DML to the feasibility problem of finite chance constraints. We show that the minimizer of proxy-based … (a sketch of a pairwise violation loss follows the link below)
External link:
http://arxiv.org/abs/2209.09060
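For the pairwise intra-/inter-class proximity violations the abstract refers to, here is a generic contrastive-style sketch. The margin value and helper name are assumptions; the paper's chance-constraint formulation is not reproduced here.

```python
import torch

def pairwise_violation_loss(emb, labels, margin=0.5):
    """Empirical loss over pairwise proximity violations: intra-class
    pairs should lie within `margin`, inter-class pairs beyond it."""
    d = torch.cdist(emb, emb)                          # (N, N) distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # intra-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=emb.device)
    intra = torch.relu(d[same & ~eye] - margin)        # too-distant positives
    inter = torch.relu(margin - d[~same])              # too-close negatives
    return torch.cat([intra, inter]).mean()
```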