An audit of assessment tools in a medical school in eastern Saudi Arabia.
Author: Al-Rubaish AM; Department of Internal Medicine, College of Medicine, King Faisal University, Dammam, Saudi Arabia; Al-Umran KU; Wosornu L
Language: English
Source: Journal of Family & Community Medicine [J Family Community Med] 2005 May; Vol. 12 (2), pp. 101-5.
Abstract:
Background: Assessment has a powerful influence on curriculum delivery. Medical instructors must use tools that conform to educational principles and audit them as part of curriculum review.
Aim: To generate information to support recommendations for improving curriculum delivery.
Setting: Pre-clinical and clinical departments in a College of Medicine, Saudi Arabia.
Method: A self-administered questionnaire was used in a cross-sectional survey to determine whether the assessment tools in use met basic standards of validity, reliability and currency, and whether feedback to students was adequate. Cost, feasibility and tool combinations were excluded.
Results: Thirty-one of 34 courses were evaluated. All 31 respondents used MCQs, especially one-best-answer (28/31) and true/false (13/31) formats. Test questions were mostly selected by groups of teachers. Pre-clinical departments drew equally on "new" (10/14) and "used" (10/14) MCQs; clinical departments relied on "banked" MCQs (16/17). Most departments decided their own pass marks (28/31) and adopted the College-set 60%; the pass mark was set before the examination in 13/17 clinical departments but after it in 5/14 pre-clinical departments. Of six essay users, five used model answers but only one practised double marking. OSCE was used by 7/17 clinical departments; five of these provided checklists. Only 3/31 used an optical reader. Post-marking review was carried out by 13/14 pre-clinical but only 10/17 clinical departments. Difficulty and discrimination indices were determined by only 4/31 departments. Feedback was provided by 12/14 pre-clinical and 7/17 clinical departments. Only 10/31 course coordinators had copies of the examination regulations.
Recommendations: The single-best-answer MCQ, if properly constructed and adequately critiqued, is the preferred tool for assessing the theory domain. However, there should be fresh questions, item analyses, comparisons with previous results, optical reader systems and double marking. Departments should use the OSCE or OSPE more often. Long essays, true/false, fill-in-the-blank and more-than-one-correct-answer formats can safely be abolished. Departments or teams should set test papers and take decisions collectively. Feedback rates should be improved. A Center of Medical Education, including an Examination Center, is required. Fruitful future studies could include a repeat audit, the use of "negative questions", and the number of MCQs per test paper. A comparative audit involving other regional medical schools may be of general interest.
Database: MEDLINE
External link: