Author:
Alireza M. Javid, Arun Venkitaraman, Mikael Skoglund, Saikat Chatterjee
Language:
English
Year of publication:
2020
Subject:

Source:
EURASIP Journal on Advances in Signal Processing, Vol 2020, Iss 1, Pp 1-19 (2020)
Document type:
article
ISSN:
1687-6180
DOI:
10.1186/s13634-020-00695-2
Description:
We design a rectified linear unit-based multilayer neural network by mapping the feature vectors to a higher-dimensional space in every layer. We design the weight matrices in every layer to ensure a reduction of the training cost as the number of layers increases. Linear projection to the target in the higher-dimensional space leads to a lower training cost if a convex cost is minimized. An ℓ2-norm convex constraint is used in the minimization to reduce the generalization error and avoid overfitting. The regularization hyperparameters of the network are derived analytically to guarantee a monotonic decrement of the training cost, thereby eliminating the need for cross-validation to find the regularization hyperparameter in each layer. We show that the proposed architecture is norm-preserving and provides an invertible feature vector and, therefore, can be used to reduce the training cost of any other learning method that employs linear projection to estimate the target.
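The core mechanism described above — stacking ReLU layers so that each new layer can reproduce the previous layer's linear projection, which guarantees a non-increasing training cost — can be sketched as follows. This is a minimal illustration, not the paper's method: the data sizes, the random expansion matrix `R`, and the use of plain least squares for the per-layer convex problem are all assumptions (the paper instead solves an ℓ2-norm-constrained minimization with analytically derived hyperparameters).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical sizes: N samples, input dim d, target dim q).
N, d, q = 200, 10, 3
X = rng.standard_normal((N, d))
T = rng.standard_normal((N, q))

def relu(z):
    return np.maximum(0.0, z)

def linear_projection(Y, T):
    # Per-layer convex problem: least-squares projection of features onto the
    # target. (The paper adds an l2-norm constraint with analytically derived
    # hyperparameters; plain least squares keeps this sketch simple.)
    O, *_ = np.linalg.lstsq(Y, T, rcond=None)
    return O

Y = X
O = linear_projection(Y, T)
costs = [np.linalg.norm(Y @ O - T) ** 2]

for layer in range(3):
    # Map to a higher-dimensional space: stack [O, -O] with a random
    # expansion R. Since relu(Y @ O) - relu(-Y @ O) = Y @ O, the next layer
    # can always reproduce the previous prediction, so the optimal training
    # cost cannot increase as layers are added.
    R = rng.standard_normal((Y.shape[1], 2 * Y.shape[1]))
    W = np.hstack([O, -O, R])
    Y = relu(Y @ W)
    O = linear_projection(Y, T)
    costs.append(np.linalg.norm(Y @ O - T) ** 2)

print(np.round(costs, 4))  # non-increasing sequence of training costs
```

The identity `relu(a) - relu(-a) = a` is what makes the construction work: the feasible set of the new convex problem contains a solution recovering the previous prediction, so each added layer can only lower (or preserve) the training cost.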
Database:
Directory of Open Access Journals
External link:
