Robust learning with implicit residual networks
Author: | Clayton G. Webster, Viktor Reshniak |
Publication year: | 2019 |
Subject: | FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Statistics - Machine Learning (stat.ML); lcsh:Computer engineering. Computer hardware; lcsh:TK7885-7895; mathematical optimization; discretization; fixed point; residual neural network (ResNet); robust learning; robustness (computer science); stability; hyperparameter; nonlinear system |
Source: | Machine Learning and Knowledge Extraction, Vol. 3, Iss. 1, pp. 34-55 (2021) |
DOI: | 10.48550/arxiv.1905.10479 |
Description: | In this effort, we propose a new deep architecture utilizing residual blocks inspired by implicit discretization schemes. As opposed to standard feed-forward networks, the outputs of the proposed implicit residual blocks are defined as the fixed points of appropriately chosen nonlinear transformations. We show that this choice leads to improved stability of both forward and backward propagation, has a favorable impact on generalization power, and allows the robustness of the network to be controlled with only a few hyperparameters. In addition, the proposed reformulation of ResNet does not introduce new parameters and can potentially lead to a reduction in the number of required layers due to improved forward stability. Finally, we derive a memory-efficient training algorithm, propose a stochastic regularization technique, and provide numerical results in support of our findings. |
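The description defines each implicit block's output as a fixed point of a nonlinear transformation. Below is a minimal sketch of that idea, assuming a backward-Euler-style block whose output y solves y = x + f(y), found by plain fixed-point (Picard) iteration; the tanh branch, function names, and solver tolerances are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def residual_branch(y, W, b):
    """Illustrative nonlinear branch f(y) = tanh(W y + b)."""
    return np.tanh(W @ y + b)

def implicit_residual_block(x, W, b, n_iter=50, tol=1e-8):
    """Implicit block: return y solving the fixed-point equation y = x + f(y).

    Plain fixed-point iteration converges when f is a contraction,
    e.g. when the spectral norm of W is small enough.
    """
    y = x.copy()  # warm start from the block input
    for _ in range(n_iter):
        y_next = x + residual_branch(y, W, b)
        if np.linalg.norm(y_next - y) < tol:
            break
        y = y_next
    return y_next

# Usage: one block on a random input, with W scaled so f is contractive.
rng = np.random.default_rng(0)
d = 4
W = 0.1 * rng.standard_normal((d, d))
b = np.zeros(d)
x = rng.standard_normal(d)
y = implicit_residual_block(x, W, b)
print(np.linalg.norm(y - (x + residual_branch(y, W, b))))  # ~0: y satisfies y = x + f(y)
```

Solving for the output implicitly is what the description credits for improved forward stability; in training, gradients of such a block can be obtained via the implicit function theorem rather than by backpropagating through every solver iterate, which is one plausible route to the memory-efficient algorithm mentioned above.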
Database: | OpenAIRE |
External link: |