Author: |
Kaziakhmedov, Edgar, Kireev, Klim, Melnikov, Grigorii, Pautov, Mikhail, Petiushko, Aleksandr |
Year of publication: |
2019 |
Subject: |
|
Source: |
2019 International Multi-Conference on Engineering, Computer and Information Sciences (SIBIRCON) |
Document type: |
Working Paper |
DOI: |
10.1109/SIBIRCON48586.2019.8958122 |
Description: |
Recent studies have shown that deep learning approaches achieve remarkable results on the face detection task. On the other hand, these advances give rise to a new problem concerning the security of deep convolutional neural network (DCNN) models, revealing potential risks for DCNN-based applications. Even minor input changes in the digital domain can cause the network to be fooled. It was subsequently shown that some deep learning-based face detectors are vulnerable to adversarial attacks not only in the digital domain but also in the real world. In this paper, we investigate the security of the well-known cascade CNN face detection system, MTCNN, and introduce an easily reproducible and robust way to attack it. We propose different face attributes printed with an ordinary black-and-white printer and attached either to a medical face mask or directly to the face. Our approach is capable of breaking the MTCNN detector in a real-world scenario. |
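The abstract evaluates whether the MTCNN detector still finds a face once the printed patch is worn. The following is a minimal sketch of such a pass/fail check, not the authors' code: it assumes the third-party `mtcnn` Python package and OpenCV, and the image file names are hypothetical placeholders.

```python
# Minimal sketch (assumption: the `mtcnn` pip package stands in for the
# attacked cascade detector; it is not the authors' implementation).
import cv2
from mtcnn import MTCNN

detector = MTCNN()

def face_detected(image_path: str, min_confidence: float = 0.9) -> bool:
    """Return True if MTCNN reports at least one face above the threshold."""
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)  # MTCNN expects RGB input
    detections = detector.detect_faces(rgb)
    return any(d["confidence"] >= min_confidence for d in detections)

# An attack frame counts as successful when the detector misses the face.
print("clean photo detected:", face_detected("subject_clean.jpg"))        # hypothetical file
print("patched photo detected:", face_detected("subject_with_patch.jpg")) # hypothetical file
```

The confidence threshold of 0.9 is an illustrative choice, not a value taken from the paper. |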
Database: |
arXiv |
External link: |
|