Showing 1 - 5 of 5 for search: '"Karkar, Skander"'
Greedy layer-wise or module-wise training of neural networks is compelling in constrained and on-device settings where memory is limited, as it circumvents a number of problems of end-to-end back-propagation. However, it suffers from a stagnation problem …
External link:
http://arxiv.org/abs/2309.17357
We propose a detector of adversarial samples that is based on the view of neural networks as discrete dynamic systems. The detector tells clean inputs from abnormal ones by comparing the discrete vector fields they follow through the layers. We also …
External link:
http://arxiv.org/abs/2306.04252
End-to-end backpropagation has a few shortcomings: it requires loading the entire model during training, which can be impossible in constrained settings, and suffers from three locking problems (forward locking, update locking and backward locking), …
External link:
http://arxiv.org/abs/2210.00949
Neural networks have been achieving high generalization performance on many tasks despite being highly over-parameterized. Since classical statistical learning theory struggles to explain this behavior, much effort has recently been focused on uncovering …
External link:
http://arxiv.org/abs/2009.08372
Published in:
Machine Learning and Knowledge Discovery in Databases ISBN: 9783030676605
ECML/PKDD (2)
ECML PKDD, Sep 2020, Ghent, Belgium
Neural networks have been achieving high generalization performance on many tasks despite being highly over-parameterized. Since classical statistical learning theory struggles to explain this behavior, much effort has recently been focused on uncovering …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::1f54d010cf2ab8073e1295680d8c4287
https://doi.org/10.1007/978-3-030-67661-2_7