Author: |
Khalfaoui, Beyrem, Boyd, Joseph, Vert, Jean-Philippe |
Contributors: |
Centre de Bioinformatique (CBIO), MINES ParisTech - École nationale supérieure des mines de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL), Cancer et génome: Bioinformatique, biostatistiques et épidémiologie d'un système complexe, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL)-Institut Curie [Paris]-Institut National de la Santé et de la Recherche Médicale (INSERM), Google Inc, Research at Google |
Language: |
English |
Year of publication: |
2019 |
Subject: |
|
Description: |
Dropout is a regularisation technique for neural network training in which unit activations are independently set to zero at random with a given probability. In this work, we propose a generalisation of dropout and other multiplicative noise injection schemes for shallow and deep neural networks, in which the random noise applied to different units is not independent but follows a joint distribution that is either fixed or estimated during training. We provide theoretical insights into why such adaptive structured noise injection (ASNI) may be relevant, and empirically confirm that it boosts the accuracy of simple feedforward and convolutional neural networks, disentangles the hidden layer representations, and leads to sparser representations. Our proposed method is a straightforward modification of classical dropout and incurs no additional computational overhead. |
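To make the contrast concrete, the following minimal NumPy sketch shows classical dropout (independent Bernoulli masks) alongside one hypothetical form of structured multiplicative noise, where a full covariance matrix correlates the noise across units. This is an illustration of the general idea only, not the paper's ASNI algorithm; the functions, the Gaussian noise choice, and the covariance used here are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(h, p=0.5):
    # Classical dropout: each unit is zeroed independently with
    # probability p, with inverted-dropout scaling so the expected
    # activation is unchanged.
    mask = rng.random(h.shape) >= p
    return h * mask / (1.0 - p)

def structured_noise(h, cov):
    # Hypothetical structured-noise sketch: multiplicative Gaussian
    # noise with mean 1 and a full covariance matrix, so the noise on
    # different units is correlated rather than independent. In ASNI
    # this joint distribution could be fixed or estimated during
    # training; here it is simply fixed.
    noise = rng.multivariate_normal(np.ones(h.shape[-1]), cov)
    return h * noise

h = np.ones(4)                        # toy hidden-layer activations
cov = 0.25 * (0.5 * np.eye(4) + 0.5)  # correlated, positive semidefinite
print(dropout(h))                     # entries are 0 or 1/(1-p) = 2
print(structured_noise(h, cov))       # jointly perturbed activations
```

Because the noise remains multiplicative with unit mean, the structured variant keeps the expected activation unchanged, just as inverted dropout does; only the dependence structure across units differs.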
Database: |
OpenAIRE |
External link: |
|