Showing 1 - 5 of 5 for search: '"Araghi, Amin"'
Knowledge Distillation (KD) has proven effective for compressing large teacher models into smaller student models. While it is well known that student models can achieve similar accuracies as the teachers, it has also been shown that they nonetheless …
External link:
http://arxiv.org/abs/2402.03119
Published in:
2023 IEEE/CVF International Conference on Computer Vision (ICCV), Paris, France, 2023, pp. 1922-1933
Despite being highly performant, deep neural networks might base their decisions on features that spuriously correlate with the provided labels, thus hurting generalization. To mitigate this, 'model guidance' has recently gained popularity, i.e. the …
External link:
http://arxiv.org/abs/2303.11932
Published in:
In Diamond & Related Materials, March 2024, Vol. 143
Deep neural networks are highly performant, but might base their decision on spurious or background features that co-occur with certain classes, which can hurt generalization. To mitigate this issue, the usage of 'model guidance' has gained popularity …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::5da59412292d856a9093cf840927ebca
Author:
Araghi, Amin (Amin_araghi_1368@yahoo.com)
Published in:
Applied Computational Electromagnetics Society Journal, Dec 2016, Vol. 31, Issue 12, pp. 1416-1420. 5p.