Fast Few-shot Debugging for NLU Test Suites

Authors: Malon, Christopher; Li, Kai; Kruus, Erik
Year of publication: 2022
Subject:
Document type: Working Paper
Description: We study few-shot debugging of transformer-based natural language understanding models, using recently popularized test suites not just to diagnose but to correct a problem. Given a few debugging examples of a certain phenomenon, and a held-out test set of the same phenomenon, we aim to maximize accuracy on the phenomenon at a minimal cost to accuracy on the original test set. We examine several methods that are faster than full-epoch retraining. We introduce a new fast method, which samples a few in-danger examples from the original training set. Compared to fast methods using parameter-distance constraints or Kullback-Leibler divergence, we achieve superior original accuracy for comparable debugging accuracy.
Comment: To appear at ACL 2022 Deep Learning Inside Out (DeeLIO) workshop
Database: arXiv
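
The description above outlines the approach only at a high level. The sketch below is a minimal, hypothetical illustration (not the authors' released code) of the general idea: select "in-danger" examples from the original training set, here assumed to mean the examples with the smallest prediction margin under the current model, and mix them with the debugging examples for a brief fine-tuning run instead of full-epoch retraining. The selection criterion, function names, and hyperparameters are assumptions for illustration.

```python
import torch
import torch.nn.functional as F


def sample_in_danger(model, X_train, y_train, k=32):
    """Return the k training examples the model is closest to getting wrong,
    scored by the margin between the gold logit and the best competing logit.
    (Assumed criterion for 'in-danger'; the paper may define it differently.)"""
    model.eval()
    with torch.no_grad():
        logits = model(X_train)
        gold = logits.gather(1, y_train.unsqueeze(1)).squeeze(1)
        masked = logits.clone()
        masked.scatter_(1, y_train.unsqueeze(1), float("-inf"))
        runner_up = masked.max(dim=1).values
        margin = gold - runner_up
    idx = margin.argsort()[:k]  # smallest margins first
    return X_train[idx], y_train[idx]


def few_shot_debug(model, X_bug, y_bug, X_train, y_train, steps=20, lr=1e-5):
    """A few fine-tuning steps on debugging examples plus sampled in-danger
    originals, so the fix is cheap and less likely to erase original accuracy."""
    X_keep, y_keep = sample_in_danger(model, X_train, y_train)
    X = torch.cat([X_bug, X_keep])
    y = torch.cat([y_bug, y_keep])
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(X), y)
        loss.backward()
        opt.step()
    return model
```

Here `model` is any classifier mapping input tensors to class logits (for an NLU model, the inputs would be pre-tokenized batches rather than raw tensors); the key point is that only a handful of original-training examples, chosen by the model's own uncertainty, accompany the debugging examples.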