Reading Comprehension Quiz Generation using Generative Pre-trained Transformers
Author: Dijkstra, R., Genç, Z., Kayal, S., Kamps, J., Sosnovsky, S., Brusilovsky, P., Lan, A.
Contributors: ILLC (FGw)
Language: English
Year of publication: 2022
Source: Proceedings of the Fourth International Workshop on Intelligent Textbooks 2022, co-located with the 23rd International Conference on Artificial Intelligence in Education (AIED 2022), Durham, UK, July 27, 2022, pp. 4-17
Description: Recent advances in AI have resulted in large pre-trained language models with superior performance on text generation tasks, prompting the question of whether we can use them to generate educationally useful text completions. This holds the potential to generate relevant quizzes for any educational text, greatly complementing the formative and summative tests currently produced by education professionals. We explore pre-trained language models for quiz generation on reading comprehension texts and propose EduQuiz, an end-to-end quiz generator based on a GPT-3 model fine-tuned on text-quiz pairs, able to generate a complete multiple-choice question with the correct answer and distractors. We observed that the majority of generated quizzes are reasonable, and that generating high-quality distractors is more challenging than generating questions and answers. More generally, while it may be too early to replace manually created tests for summative feedback and grading with automatic quiz generation, EduQuiz already has potential value for formative feedback and for increasing engagement during the learning phase by enhancing textbooks with assessments. (An illustrative sketch of this kind of text-to-quiz generation appears after the record below.)
Database: OpenAIRE
External link:
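
The record gives no implementation details beyond the abstract. Purely as a hedged illustration of the text-to-quiz setup it describes, the Python sketch below queries a hypothetical GPT-3-style model fine-tuned on passage-quiz pairs via the OpenAI legacy completions endpoint. The model ID, prompt template, stop sequence, and example passage are assumptions made for illustration, not EduQuiz's actual configuration.

```python
# Illustrative sketch only: model ID, prompt template, and separators are
# assumptions, not the EduQuiz configuration from the paper.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# A fine-tuning record would pair a passage with its full multiple-choice
# quiz (question, correct answer, distractors), e.g. in prompt/completion form:
#   {"prompt": "<passage>\n\nQuiz:\n", "completion": " Q: ...\nA) ...\n..."}

PASSAGE = (
    "Photosynthesis is the process by which green plants use sunlight, water, "
    "and carbon dioxide to produce glucose and oxygen."
)

response = client.completions.create(
    model="ft:davinci-002:example-org::quiz-generator",  # placeholder fine-tune ID
    prompt=f"{PASSAGE}\n\nQuiz:\n",
    max_tokens=150,
    temperature=0.7,
    stop=["\n\n"],  # assumed end-of-quiz marker learned during fine-tuning
)

print(response.choices[0].text.strip())
```

Generating the question, correct answer, and distractors in a single completion mirrors the end-to-end framing in the abstract, where one fine-tuned model emits the complete multiple-choice item rather than chaining separate question, answer, and distractor generators.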