An Automated Explainable Educational Assessment System Built on LLMs

Authors: Li, Jiazheng; Bobrov, Artem; West, David; Aloisi, Cesare; He, Yulan
Publication year: 2024
Subject:
Document type: Working Paper
Description: In this demo, we present AERA Chat, an automated and explainable educational assessment system designed for interactive and visual evaluation of student responses. The system leverages large language models (LLMs) to generate automated marks and rationale explanations, addressing both the limited explainability of automated educational assessment and the high cost of annotation. Users can input questions and student answers, giving educators and researchers insight into assessment accuracy and the quality of LLM-generated rationales. The system also offers advanced visualization and robust evaluation tools, improving usability for educational assessment and enabling efficient rationale verification. Our demo video can be found at https://youtu.be/qUSjz-sxlBc.
Comment: Accepted to AAAI 2025
Database: arXiv