Showing 1 - 5 of 5
for search: '"Summers, Cecilia"'
Author:
Summers, Cecilia, Dinneen, Michael J.
Nondeterminism in neural network optimization produces uncertainty in performance, making small improvements difficult to discern from run-to-run variability. While uncertainty can be reduced by training multiple model copies, doing so is time-consuming…
External link:
http://arxiv.org/abs/2103.04514
Author:
Summers, Cecilia, Dinneen, Michael J.
While great progress has been made at making neural networks effective across a wide range of visual tasks, most models are surprisingly vulnerable. This frailness takes the form of small, carefully chosen perturbations of their input, known as adversarial…
External link:
http://arxiv.org/abs/1906.03749
Author:
Summers, Cecilia, Dinneen, Michael J.
A key component of most neural network architectures is the use of normalization layers, such as Batch Normalization. Despite its common use and large utility in optimizing deep architectures, it has been challenging both to generically improve upon…
External link:
http://arxiv.org/abs/1906.03548
Author:
Summers, Cecilia, Dinneen, Michael J.
In order to reduce overfitting, neural networks are typically trained with data augmentation, the practice of artificially generating additional training data via label-preserving transformations of existing training examples. While these types of transformations…
External link:
http://arxiv.org/abs/1805.11272
Published in:
Fortnight, 1997 Jul 01(363), 4-4.
External link:
https://www.jstor.org/stable/25559385