Identifying good forecasters via adaptive cognitive tests

Authors: Merkle, Edgar C., Petrov, Nikolay, Zhu, Sophie Ma, Karger, Ezra, Tetlock, Philip E., Himmelstein, Mark
Year of publication: 2024
Subject:
Document type: Working Paper
Description: Assessing forecasting proficiency is a time-intensive activity, often requiring us to wait months or years before we know whether or not the reported forecasts were good. In this study, we develop adaptive cognitive tests that predict forecasting proficiency without the need to wait for forecast outcomes. Our procedures provide information about which cognitive tests to administer to each individual, as well as how many cognitive tests to administer. Using item response models, we identify and tailor cognitive tests to assess forecasters of different skill levels, aiming to optimize accuracy and efficiency. We show how the procedures can select highly informative cognitive tests from a larger battery of tests, reducing the time taken to administer the tests. We use a second, independent dataset to show that the selected tests yield scores that are highly related to forecasting proficiency. This approach enables real-time, adaptive testing, providing immediate insights into forecasting talent in practical contexts.
Database: arXiv
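The adaptive procedure described in the abstract — selecting, from a larger battery, the test items that are most informative about an individual's latent ability — can be sketched with a standard item response model. The code below is a minimal illustration only, not the authors' implementation: it assumes a two-parameter logistic (2PL) model, selects each next item by maximizing Fisher information at the current ability estimate, and re-estimates ability by grid-search maximum likelihood. All function names and item parameters are hypothetical.

```python
import math

def p_correct(theta, a, b):
    # 2PL item response function: probability of a correct response
    # given ability theta, discrimination a, and difficulty b.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    # Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p).
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def select_item(theta, items, administered):
    # Pick the not-yet-administered item with maximum information at theta.
    # items: list of (a, b) tuples; administered: set of item indices.
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))

def estimate_theta(responses, items):
    # Coarse grid-search maximum-likelihood estimate of ability.
    # responses: list of (item_index, 0/1) pairs.
    grid = [g / 10.0 for g in range(-40, 41)]
    def loglik(theta):
        total = 0.0
        for i, x in responses:
            p = p_correct(theta, *items[i])
            total += math.log(p) if x else math.log(1.0 - p)
        return total
    return max(grid, key=loglik)

# Toy battery: three items of equal discrimination, difficulties -1, 0, 1.
items = [(1.0, -1.0), (1.0, 0.0), (1.0, 1.0)]
# At theta = 0, the item with difficulty 0 is the most informative.
first = select_item(0.0, items, set())
```

In this sketch, a forecaster who starts at an ability estimate of 0 is first given the medium-difficulty item, and each response shifts the estimate before the next item is chosen — the mechanism by which adaptive testing shortens the battery without sacrificing measurement precision.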