Measuring AI Systems Beyond Accuracy

Author: Turri, Violet, Dzombak, Rachel, Heim, Eric, VanHoudnos, Nathan, Palat, Jay, Sinha, Anusha
Publication year: 2022
Subject:
Document type: Working Paper
Description: Current test and evaluation (T&E) methods for assessing machine learning (ML) system performance often rely on incomplete metrics. Additionally, testing is often siloed from the other phases of the ML system lifecycle. Research investigating cross-domain approaches to ML T&E is needed to drive the state of the art forward and to build an Artificial Intelligence (AI) engineering discipline. This paper advocates for a robust, integrated approach to testing by outlining six key questions for guiding a holistic T&E strategy.
Comment: 8 pages, Presented at 2022 AAAI Spring Symposium Series Workshop on AI Engineering: Creating Scalable, Human-Centered and Robust AI Systems
Database: arXiv