AutoNLU: Detecting, root-causing, and fixing NLU model errors

Author: Sethi, Pooja, Savenkov, Denis, Arabshahi, Forough, Goetz, Jack, Tolliver, Micaela, Scheffer, Nicolas, Kabul, Ilknur, Liu, Yue, Aly, Ahmed
Publication year: 2021
Subject:
Document type: Working Paper
Description: Improving the quality of Natural Language Understanding (NLU) models, and more specifically, task-oriented semantic parsing models, in production is a cumbersome task. In this work, we present a system called AutoNLU, which we designed to scale the NLU quality improvement process. It adds automation to three key steps: detection, attribution, and correction of model errors, i.e., bugs. We detected four times more failed tasks than with random sampling, finding that even a simple active learning sampling method on an uncalibrated model is surprisingly effective for this purpose. The AutoNLU tool empowered linguists to fix ten times more semantic parsing bugs than with prior manual processes, auto-correcting 65% of all identified bugs.
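The detection step described above amounts to active-learning-style selection: utterances on which the (uncalibrated) model is least confident about its own top parse are surfaced for human review. The sketch below illustrates least-confidence sampling in that spirit; the names (`select_for_review`, `model_confidences`) are hypothetical and not taken from the paper.

```python
# Illustrative sketch of least-confidence sampling for surfacing likely
# NLU errors. Names and data here are hypothetical, not from AutoNLU.
from typing import List, Tuple


def select_for_review(
    utterances: List[str],
    model_confidences: List[float],
    budget: int,
) -> List[Tuple[str, float]]:
    """Return the `budget` utterances the model is least confident about.

    `model_confidences` holds the (possibly uncalibrated) probability the
    model assigns to its own top parse for each utterance; low values are
    treated as likely bugs and routed to reviewers.
    """
    ranked = sorted(zip(utterances, model_confidences), key=lambda pair: pair[1])
    return ranked[:budget]


if __name__ == "__main__":
    # Toy traffic sample with made-up confidence scores.
    utterances = [
        "set an alarm for 7 am",
        "play the new album by that band",
        "remind me to water the plants when i get home",
        "what's the weather like in paris next tuesday",
    ]
    confidences = [0.98, 0.91, 0.42, 0.67]

    for text, conf in select_for_review(utterances, confidences, budget=2):
        print(f"{conf:.2f}  {text}")
```

Even with an uncalibrated model, ranking by raw confidence like this is enough to concentrate reviewer effort on likely failures, which is consistent with the 4x gain over random sampling reported in the abstract.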
Comment: 8 pages, 5 figures
Database: arXiv