Adversarial Detector with Robust Classifier

Author: Osakabe, Takayuki; Aprilpyone, Maungmaung; Shiota, Sayaka; Kiya, Hitoshi
Publication year: 2022
Subject:
Document type: Working Paper
Description: Deep neural network (DNN) models are well known to be easily fooled into misclassification by input images with small perturbations, called adversarial examples. In this paper, we propose a novel adversarial detector, consisting of a robust classifier and a plain one, to detect adversarial examples with high accuracy. Detection is carried out on the basis of the logits of the plain and robust classifiers. In an experiment, the proposed detector is demonstrated to outperform a state-of-the-art detector that does not use any robust classifier.
Database: arXiv
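
The description above states that detection is carried out using the logits of a plain and a robust classifier. Below is a minimal PyTorch sketch of one such logit-comparison scheme; the disagreement measure, the `threshold` value, and the helper name `detect_adversarial` are illustrative assumptions, not the paper's exact rule.

```python
import torch
import torch.nn.functional as F


def detect_adversarial(x, plain_model, robust_model, threshold=0.5):
    """Flag inputs on which the plain and robust classifiers disagree.

    Illustrative sketch only: the disagreement measure and threshold
    are assumptions, not the rule proposed in the paper.
    """
    plain_model.eval()
    robust_model.eval()
    with torch.no_grad():
        logits_plain = plain_model(x)    # logits from the plain classifier
        logits_robust = robust_model(x)  # logits from the robust classifier

    # Compare the two classifiers via their softmax outputs.
    p_plain = F.softmax(logits_plain, dim=-1)
    p_robust = F.softmax(logits_robust, dim=-1)

    # L1 distance between the probability vectors as a disagreement score;
    # inputs exceeding the (assumed) threshold are flagged as adversarial.
    disagreement = (p_plain - p_robust).abs().sum(dim=-1)
    return disagreement > threshold
```

In practice, `plain_model` would be a standard classifier and `robust_model` a robustified counterpart trained on the same task, so that clean inputs yield similar logits while adversarial inputs tend to drive the two apart.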