Showing 1 - 6 of 6
for search: '"Wonpyo Park"'
Published in:
Entropy, Vol 24, Iss 4, p 501 (2022)
For high-dimensional data such as images, learning an encoder that can output a compact yet informative representation is a key task on its own, in addition to facilitating subsequent processing of data. We present a model that produces discrete info…
External link:
https://doaj.org/article/6e80c1b35b2f4b5fa42f31f1d463db96
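The snippet above is truncated before the model is described, so the following is only a generic sketch of one common way to discretize an encoder's output: mapping each continuous vector to its nearest entry in a small codebook. The codebook here is random and purely illustrative; it is not the paper's method.

```python
import numpy as np

def discretize(encoder_out, n_codes=8):
    # Map each continuous encoder output vector to the index of the
    # nearest codebook entry, giving a compact discrete representation.
    # The random codebook below is illustrative only.
    rng = np.random.default_rng(0)
    codebook = rng.normal(size=(n_codes, encoder_out.shape[-1]))
    d = ((encoder_out[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    codes = d.argmin(axis=1)           # one discrete code index per vector
    return codes, codebook[codes]      # indices and their quantized vectors
```

In practice the codebook would be learned jointly with the encoder; this sketch only shows the nearest-neighbor assignment step.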
Author:
Yonghyun Kim, Wonpyo Park
We propose a novel distance-based regularization method for deep metric learning called Multi-level Distance Regularization (MDR). MDR explicitly disturbs a learning procedure by regularizing pairwise distances between embedding vectors into multiple…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2a300efa5f3e071c3d79cfc4b85f15fd
http://arxiv.org/abs/2102.04223
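The idea in the MDR snippet above can be sketched minimally: compute all pairwise distances between embeddings and penalize each one by its gap to the nearest of several distance levels. The level values below are illustrative assumptions; the paper's actual formulation (e.g. how levels are chosen or learned) is not reproduced here.

```python
import numpy as np

def pairwise_distances(emb):
    # Euclidean distance between every pair of embedding vectors.
    diff = emb[:, None, :] - emb[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def multi_level_distance_reg(emb, levels=(0.5, 1.0, 2.0)):
    # Penalize each pairwise distance by its squared gap to the
    # nearest level; the level values here are illustrative only.
    dists = pairwise_distances(emb)[np.triu_indices(len(emb), k=1)]
    gaps = np.abs(dists[:, None] - np.asarray(levels)[None, :])
    return (gaps.min(axis=1) ** 2).mean()
```

The penalty vanishes when every pairwise distance sits exactly on a level, so minimizing it pushes distances toward a multi-level structure rather than letting them collapse or diverge.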
Published in:
Computer Vision – ECCV 2020 Workshops ISBN: 9783030664145
ECCV Workshops (1)
Mutual learning is an ensemble training strategy to improve generalization by transferring individual knowledge to each other while simultaneously training multiple models. In this work, we propose an effective mutual learning method for deep metric…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::389e2c2d80197dfa51225f6cab1884fe
https://doi.org/10.1007/978-3-030-66415-2_49
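The mutual-learning setup the snippet above describes is commonly implemented by giving each co-trained model a loss that combines its own cross-entropy with a KL term pulling it toward its peer's predictions. This is a minimal sketch of that generic scheme (as in deep mutual learning), not the metric-learning method the paper proposes.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # Mean KL divergence KL(p || q) over a batch of distributions.
    return (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=-1).mean()

def mutual_learning_losses(logits_a, logits_b, labels):
    # Each model minimizes its own cross-entropy plus a KL term that
    # pulls its predictions toward the peer model's predictions.
    pa, pb = softmax(logits_a), softmax(logits_b)
    idx = np.arange(len(labels))
    ce_a = -np.log(pa[idx, labels] + 1e-12).mean()
    ce_b = -np.log(pb[idx, labels] + 1e-12).mean()
    return ce_a + kl(pb, pa), ce_b + kl(pa, pb)
```

When the two models agree, the KL terms vanish and each loss reduces to plain cross-entropy; disagreement penalizes both, which is the mechanism behind the knowledge transfer.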
Published in:
CVPR
In the field of face recognition, a model learns to distinguish millions of face images with fewer dimensional embedding features, and such vast information may not be properly encoded in the conventional model with a single branch. We propose a nove…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::c6ed429d1db0b3b804621e78cf840e38
Published in:
Computer Vision – ECCV 2020 ISBN: 9783030585440
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::e2760fe4765f0d0155475bdd0eda654a
https://doi.org/10.1007/978-3-030-58545-7_31
Published in:
CVPR
Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expressed as a form of training the student to mimic output activations of indi…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::f85b0faeeaace17ce27f51959325e504
http://arxiv.org/abs/1904.05068
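The "previous approaches" the snippet above refers to are conventional logit-matching distillation: soften teacher and student outputs with a temperature and train the student to match the teacher per example. The sketch below shows that baseline, not the paper's proposed method; the temperature value is an illustrative assumption.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Soften both output distributions with temperature T and match
    # the student to the teacher via KL(teacher || student), scaled
    # by T^2 to keep gradient magnitudes comparable across T.
    ps = softmax(student_logits / T)
    pt = softmax(teacher_logits / T)
    eps = 1e-12
    return (T ** 2) * (pt * (np.log(pt + eps) - np.log(ps + eps))).sum(axis=-1).mean()
```

The loss is zero when student and teacher logits induce the same softened distribution, so minimizing it transfers the teacher's per-example output behavior to the student.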