Interpretable Dropout Prediction: Towards XAI-Based Personalized Intervention.

Authors: Nagy, Marcell; Molontay, Roland
Source: International Journal of Artificial Intelligence in Education (Springer Science & Business Media B.V.); Jun2024, Vol. 34 Issue 2, p274-300, 27p
Abstract: Student dropout is one of the most pressing issues in STEM higher education, inducing considerable social and economic costs. Using machine learning tools for the early identification of students at risk of dropping out has attracted considerable interest recently. However, there has been little discussion of dropout prediction using interpretable machine learning (IML) and explainable artificial intelligence (XAI) tools. In this work, using data from a large public Hungarian university, we demonstrate how IML and XAI tools can support educational stakeholders in dropout prediction. We show that complex machine learning models, such as the CatBoost classifier, can efficiently identify at-risk students relying solely on pre-enrollment achievement measures; however, they lack interpretability. Applying IML tools, such as permutation importance (PI), partial dependence plots (PDP), LIME, and SHAP values, we demonstrate how the predictions can be explained both globally and locally. Explaining individual predictions opens up great opportunities for personalized intervention, for example by offering the right remedial courses or tutoring sessions. Finally, we present the results of a user study that evaluates whether higher education stakeholders find these tools interpretable and useful. [ABSTRACT FROM AUTHOR]
Database: Complementary Index
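
Note: The abstract describes a pipeline of training a CatBoost classifier on pre-enrollment measures and then explaining its predictions globally and locally. Below is a minimal Python sketch of that general workflow, not the authors' code: the feature names and the synthetic data are illustrative assumptions, and the real study uses the university's own records. It shows permutation importance as a global explanation and SHAP values as a local one; LIME and partial dependence plots are omitted for brevity.

```python
# Minimal sketch of the abstract's workflow: CatBoost + global/local explanations.
# Synthetic data and feature names are hypothetical stand-ins for the study's dataset.
import numpy as np
import pandas as pd
import shap
from catboost import CatBoostClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical pre-enrollment achievement measures (illustrative only).
rng = np.random.default_rng(0)
n = 1000
X = pd.DataFrame({
    "math_grade": rng.normal(3.5, 1.0, n),
    "admission_score": rng.normal(400, 50, n),
    "language_exam_passed": rng.integers(0, 2, n),
})
# Synthetic dropout label loosely driven by the features.
y = (X["math_grade"] + X["admission_score"] / 100 + rng.normal(0, 1, n) < 7.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = CatBoostClassifier(iterations=200, verbose=False, random_seed=0)
model.fit(X_train, y_train)

# Global explanation: permutation importance on held-out data.
pi = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, imp in sorted(zip(X.columns, pi.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")

# Local explanation: SHAP contributions for a single (potentially at-risk) student.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
student_idx = 0
print("SHAP contributions for student", student_idx,
      dict(zip(X.columns, np.round(shap_values[student_idx], 3))))
```

In this kind of setup, the global view (permutation importance, PDP) tells stakeholders which pre-enrollment measures drive dropout risk overall, while the per-student SHAP breakdown indicates which weakness (e.g., a low mathematics grade) contributes most to an individual's risk, which is what makes targeted remedial courses or tutoring plausible.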