Contextual Breach: Assessing the Robustness of Transformer-based QA Models

Author: Saadat, Asir; Asad, Nahian Ibn; Ishmam, Md Farhan
Publication Year: 2024
Subject:
Document Type: Working Paper
Description: Contextual question-answering models are susceptible to adversarial perturbations of the input context, as commonly observed in real-world scenarios. Such adversarial noise is designed to degrade model performance by distorting the textual input. We introduce a unique dataset that incorporates seven distinct types of adversarial noise into the context, each applied at five different intensity levels, built on the SQuAD dataset. To quantify robustness, we utilize robustness metrics that provide a standardized measure for assessing model performance across varying noise types and levels. Experiments on transformer-based question-answering models reveal robustness vulnerabilities and important insights into model performance on realistic textual input.
Database: arXiv
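
The record does not detail the seven noise types or the exact robustness metrics. As a minimal sketch of the kind of setup the description implies, the Python below injects a hypothetical character-level typo noise at five intensity levels and scores robustness as the ratio of perturbed to clean performance. The function names (`inject_typo_noise`, `robustness_score`) and the intensity-to-corruption mapping are illustrative assumptions, not the authors' implementation.

```python
import random


def inject_typo_noise(context: str, intensity: int, seed: int = 0) -> str:
    """Corrupt a fraction of words in `context` with adjacent-character swaps.

    `intensity` (1-5) maps to the fraction of words perturbed; the mapping
    used here (intensity * 5%) is an illustrative guess, not the paper's
    actual scheme.
    """
    rng = random.Random(seed)
    words = context.split()
    n_noisy = int(len(words) * intensity * 0.05)
    for idx in rng.sample(range(len(words)), min(n_noisy, len(words))):
        w = words[idx]
        if len(w) > 2:
            # swap two adjacent characters to simulate a typo
            i = rng.randrange(len(w) - 1)
            words[idx] = w[:i] + w[i + 1] + w[i] + w[i + 2:]
    return " ".join(words)


def robustness_score(clean_f1: float, noisy_f1: float) -> float:
    """Ratio of perturbed to clean performance (1.0 = fully robust).

    A simple standardized measure; the paper's exact metric may differ.
    """
    return noisy_f1 / clean_f1 if clean_f1 > 0 else 0.0


if __name__ == "__main__":
    ctx = "The Normans were the people who gave their name to Normandy."
    for level in range(1, 6):
        print(level, inject_typo_noise(ctx, level))
```

In such a setup, a QA model would be evaluated on the clean SQuAD contexts and on each (noise type, intensity level) variant, with `robustness_score` summarizing the degradation per condition.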