Design and Development of a Framework for an Automatic Answer Evaluation System Based on Similarity Measures
Author: | Ramamurthy Madhumitha, Krishnamurthi Ilango |
---|---|
Language: | English |
Publication Year: | 2017 |
Subject: | |
Source: | Journal of Intelligent Systems, Vol 26, Iss 2, Pp 243-262 (2017) |
Document Type: | article |
ISSN: | 0334-1860; 2191-026X |
DOI: | 10.1515/jisys-2015-0031 |
Description: | The assessment of answers is an important process that demands considerable effort from evaluators, requiring sustained concentration without fluctuations in mood. This substantiates the need to automate answer script evaluation. For text answer evaluation, sentence similarity measures have been widely used to compare student-written answers with reference texts. In this paper, we propose an automated answer evaluation system that uses our proposed cosine-based sentence similarity measures to evaluate answers. Cosine measures have proved effective in comparing free-text student answers with reference texts. We propose a set of novel cosine-based sentence similarity measures with varied approaches to constructing the document vector space. In addition, we propose a novel synset-based word similarity measure for computing document vectors, coupled with varied dimensionality-reduction approaches for reducing the dimensions of the vector space. In total, we propose 21 cosine-based sentence similarity measures and evaluate their performance on the MSR Paraphrase Corpus and Li's benchmark dataset. We also apply these measures in the automatic answer evaluation system and compare their performance on the Kaggle short answer and essay dataset. System-generated scores are compared with human scores using Pearson correlation, and the results show that the two are correlated. |
Database: | Directory of Open Access Journals |
External Link: |
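
The abstract above describes scoring free-text answers by comparing them with reference texts via cosine similarity and validating system scores against human marks with Pearson correlation. The sketch below illustrates that general pipeline only; it is not the paper's 21 proposed measures. It uses plain term-frequency vectors instead of the synset-based weighting and dimensionality reduction described in the abstract, and the reference answer, student answers, and human scores are hypothetical.

```python
import math
from collections import Counter

def cosine_similarity(answer: str, reference: str) -> float:
    """Cosine similarity between two texts using simple term-frequency vectors."""
    a, b = Counter(answer.lower().split()), Counter(reference.lower().split())
    vocab = set(a) | set(b)
    dot = sum(a[t] * b[t] for t in vocab)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def pearson(x: list[float], y: list[float]) -> float:
    """Pearson correlation between system-generated and human scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical grading data: each student answer is scored against a reference
# answer, and the resulting similarity scores are correlated with human marks.
reference = "photosynthesis converts light energy into chemical energy in plants"
student_answers = [
    "plants use photosynthesis to turn light energy into chemical energy",
    "photosynthesis is how plants make food from sunlight",
    "cells divide by mitosis",
]
human_scores = [5.0, 3.5, 0.5]

system_scores = [cosine_similarity(ans, reference) for ans in student_answers]
print("system scores:", [round(s, 3) for s in system_scores])
print("pearson r:", round(pearson(system_scores, human_scores), 3))
```

The measures proposed in the paper differ mainly in how the document vector space is built and reduced; swapping `cosine_similarity` for such a measure would leave the rest of this scoring-and-correlation pipeline unchanged.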