Program Analysis Tools in Automated Grading of Homework Assignments
Author: Ana Milanova, Buster O. Holzbauer, Samuel Breese, Barbara Cutler, Elizabeth Dinella, Evan Maicus, Matthew Peveler
Year of publication: 2018
Subject: unit testing, code coverage, inference, static program analysis, program analysis, debugging, use cases, software engineering, grading (education), computer science
Source: SIGCSE
DOI: 10.1145/3159450.3162238
Description: With surging enrollment in Computer Science courses at both the introductory and advanced levels, it is critical to leverage automated testing and grading to ensure consistent assessment of student learning. Program analysis tools allow us to streamline the grading process so that instructors and TAs can spend more time teaching, tutoring one-on-one, and mentoring students. We present complex use cases of automated assignment testing and grading within the open-source homework submission system Submitty. Students receive immediate and detailed feedback from the automated grader, and can resubmit to correct errors. Submitty uses custom-built grading tools, including difference checking of plaintext program output, instructor-authored assignment-specific custom graders, and static analysis tools that reason about program structure. In addition, it employs a variety of external tools, including version control (Git and SVN), unit testing frameworks (JUnit), memory debugging tools (Valgrind and Dr. Memory), and code coverage tools (Emma). In this poster we describe our experience with memory debugging and code coverage tools, and outline plans to add immutability inference and verification.
Database: OpenAIRE
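The description above mentions difference checking of plaintext program output as one of Submitty's grading techniques. As a hedged illustration only, here is a minimal Python sketch of that idea; the function name `diff_check`, the file paths, and the pass/fail convention are hypothetical and do not reflect Submitty's actual grader API.

```python
import difflib
import subprocess

def diff_check(student_cmd, input_path, expected_path, timeout=10):
    """Run a student program on one test input and difference-check
    its plaintext stdout against the instructor's expected output."""
    with open(input_path) as stdin_file:
        result = subprocess.run(
            student_cmd,
            stdin=stdin_file,
            capture_output=True,
            text=True,
            timeout=timeout,
        )
    with open(expected_path) as f:
        expected = f.read().splitlines()
    actual = result.stdout.splitlines()
    # A unified diff is empty exactly when the outputs match line for line.
    diff = list(difflib.unified_diff(
        expected, actual, fromfile="expected", tofile="student", lineterm=""))
    return len(diff) == 0, "\n".join(diff)

if __name__ == "__main__":
    # Hypothetical test case: student binary, test input, expected output.
    passed, report = diff_check(["./student_program"], "test1.in", "test1.expected")
    print("PASS" if passed else "FAIL\n" + report)
```

External memory-debugging tools can be wrapped in the same pattern; for example, running the student binary as `valgrind --error-exitcode=1 ./student_program` surfaces memory errors through the process exit code, which a grader like the sketch above can then check.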