CERT-ED: Certifiably Robust Text Classification for Edit Distance

Author: Huang, Zhuoqun; Marchant, Neil G.; Ohrimenko, Olga; Rubinstein, Benjamin I. P.
Year of publication: 2024
Subject:
Document type: Working Paper
Description: With the growing integration of AI into daily life, ensuring the robustness of systems to inference-time attacks is crucial. Among approaches for certifying robustness to such adversarial examples, randomized smoothing has emerged as highly promising because it acts as a wrapper around arbitrary black-box models. Previous work on randomized smoothing in natural language processing has primarily focused on specific subsets of edit distance operations, such as synonym substitution or word insertion, without certifying all edit operations. In this paper, we adapt Randomized Deletion (Huang et al., 2023) and propose the CERTified Edit Distance defense (CERT-ED) for natural language classification. Through comprehensive experiments, we demonstrate that CERT-ED outperforms the existing Hamming distance method RanMASK (Zeng et al., 2023) on 4 out of 5 datasets in terms of both accuracy and the cardinality of the certificate. By covering various threat models, including 5 direct and 5 transfer attacks, our method improves empirical robustness in 38 out of 50 settings.
Comment: 22 pages, 3 figures, 12 tables. Includes 11 pages of appendices.
Database: arXiv
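
The description characterizes randomized smoothing as a wrapper around an arbitrary black-box classifier. The snippet below is a minimal sketch of the general randomized-deletion-smoothing idea (perturb the input by random deletions, classify each perturbed copy, take a majority vote); it is an illustration under assumed names (`base_classifier`, `p_del`, `n_samples`) and parameter values, not the authors' CERT-ED implementation, and it omits the certification step entirely.

```python
import random
from collections import Counter
from typing import Callable


def randomized_deletion_predict(
    text: str,
    base_classifier: Callable[[str], str],  # assumed black-box model: text -> label
    p_del: float = 0.9,                     # per-character deletion probability (assumed value)
    n_samples: int = 100,                   # number of smoothed samples (assumed value)
    seed: int = 0,
) -> str:
    """Majority-vote prediction of a deletion-smoothed classifier.

    Each sample keeps every character of `text` independently with
    probability 1 - p_del; the retained subsequence is then classified
    by the black-box base model.
    """
    rng = random.Random(seed)
    votes: Counter = Counter()
    for _ in range(n_samples):
        kept = "".join(c for c in text if rng.random() > p_del)
        votes[base_classifier(kept)] += 1
    return votes.most_common(1)[0][0]
```

The intuition behind using deletion noise for edit distance, as described in the paper's cited prior work, is that an input and its edited variant share many deletion-induced subsequences, so the smoothed prediction tends to agree on both; the actual certificate computation requires the statistical bounds developed in the paper and is not shown here.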