Variational Randomized Smoothing for Sample-Wise Adversarial Robustness
Author: Hase, Ryo; Wang, Ye; Koike-Akino, Toshiaki; Liu, Jing; Parsons, Kieran
Publication year: 2024
Subject:
Document type: Working Paper
Description: Randomized smoothing is a defensive technique for achieving enhanced robustness against adversarial examples, i.e., small input perturbations that degrade the performance of neural network models. Conventional randomized smoothing adds random noise with a fixed noise level to every input sample to smooth out adversarial perturbations. This paper proposes a new variational framework that uses a per-sample noise level suited to each input by introducing a noise level selector. Our experimental results demonstrate enhanced empirical robustness against adversarial attacks. We also provide and analyze certified robustness for our sample-wise smoothing method.
Comment: 20 pages, under preparation
Database: arXiv
External link:
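The described scheme replaces the single fixed noise level of conventional randomized smoothing with a selector that picks a noise level per input. A minimal sketch of this idea, with a toy base classifier and a purely hypothetical heuristic selector standing in for the learned one from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def base_classifier(x):
    # Toy stand-in for a neural network: class 1 if mean(x) > 0, else class 0.
    return int(np.mean(x) > 0)

def noise_level_selector(x):
    # Hypothetical per-sample noise-level selector. The paper learns this
    # mapping; here a fixed heuristic (more smoothing for larger inputs)
    # is used purely for illustration.
    return 0.1 + 0.5 * float(np.tanh(np.linalg.norm(x)))

def smoothed_predict(x, n_samples=1000):
    # Monte Carlo estimate of the smoothed classifier
    # g(x) = argmax_c P[f(x + e) = c], with e ~ N(0, sigma(x)^2 I),
    # where sigma(x) is chosen per input rather than held fixed.
    sigma = noise_level_selector(x)
    counts = np.zeros(2, dtype=int)
    for _ in range(n_samples):
        noisy = x + rng.normal(0.0, sigma, size=x.shape)
        counts[base_classifier(noisy)] += 1
    return int(np.argmax(counts))

x = np.array([0.8, 0.6, 0.9])
print(smoothed_predict(x))  # majority vote over noisy copies of x
```

Conventional randomized smoothing corresponds to `noise_level_selector` returning a constant; making it input-dependent is what allows the noise level to adapt to each sample.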