Parameter identifiability of a deep feedforward ReLU neural network
Authors: Bona-Pellissier, Joachim; Bachoc, François; Malgouyres, François
Contributors: Institut de Mathématiques de Toulouse UMR5219 (IMT); Université Toulouse Capitole (UT Capitole); Université de Toulouse (UT); Institut National des Sciences Appliquées - Toulouse (INSA Toulouse); Université Toulouse - Jean Jaurès (UT2J); Université Toulouse III - Paul Sabatier (UT3); Centre National de la Recherche Scientifique (CNRS); Université Toulouse 1 Capitole (UT1); Université Fédérale Toulouse Midi-Pyrénées. Our work has benefited from the AI Interdisciplinary Institute ANITI. ANITI is funded by the French 'Investing for the Future – PIA3' program under Grant agreement n°ANR-19-PI3A-0004.
Language: English
Year of publication: 2021
Subjects: FOS: Computer and information sciences; FOS: Mathematics; Statistics Theory (math.ST); Machine Learning (stat.ML); Statistics - Machine Learning; [STAT.ML] Statistics [stat]/Machine Learning [stat.ML]; [MATH.MATH-ST] Mathematics [math]/Statistics [math.ST]; ACM: G.: Mathematics of Computing; ReLU networks; Deep Learning; Parameter recovery; Equivalent parameters; Symmetries; Computer Science::Databases
Description: Whether the parameters (weights and biases) of a neural network can be recovered from knowledge of the function it implements on a subset of the input space is, depending on the situation, a curse or a blessing. On the one hand, recovering the parameters enables stronger adversarial attacks and could disclose sensitive information from the dataset used to construct the network. On the other hand, if the parameters of a network can be recovered, the user is guaranteed that the features in the latent spaces can be interpreted; it also provides a foundation for obtaining formal guarantees on the performance of the network. It is therefore important to characterize the networks whose parameters can be identified and those whose parameters cannot. In this article, we provide a set of conditions on a deep fully-connected feedforward ReLU neural network under which the parameters of the network are uniquely identified, modulo permutation and positive rescaling, from the function it implements on a subset of the input space.
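The permutation and positive-rescaling symmetries mentioned in the description can be illustrated with a minimal sketch (not taken from the paper; all names and shapes here are illustrative). For a ReLU network, scaling a hidden neuron's incoming weights and bias by some λ > 0 while dividing its outgoing weights by λ leaves the implemented function unchanged, since relu(λz) = λ·relu(z) for λ > 0; consistently permuting hidden neurons likewise changes nothing.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def forward(params, x):
    # One-hidden-layer ReLU network: f(x) = W2 @ relu(W1 @ x + b1) + b2
    W1, b1, W2, b2 = params
    return W2 @ relu(W1 @ x + b1) + b2

# Random parameters: 3 inputs, 4 hidden neurons, 2 outputs.
W1 = rng.standard_normal((4, 3))
b1 = rng.standard_normal(4)
W2 = rng.standard_normal((2, 4))
b2 = rng.standard_normal(2)
params = (W1, b1, W2, b2)

# Positive rescaling: scale neuron i's incoming weights and bias by
# lam[i] > 0 and its outgoing weights by 1 / lam[i].
lam = np.array([0.5, 2.0, 3.0, 0.1])
rescaled = (lam[:, None] * W1, lam * b1, W2 / lam[None, :], b2)

# Permutation: reorder the hidden neurons consistently.
perm = np.array([2, 0, 3, 1])
permuted = (W1[perm], b1[perm], W2[:, perm], b2)

x = rng.standard_normal(3)
assert np.allclose(forward(params, x), forward(rescaled, x))
assert np.allclose(forward(params, x), forward(permuted, x))
```

These are exactly the transformations modulo which the paper's conditions guarantee unique identification of the parameters.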
Database: OpenAIRE
External link: