Performance of the Ebel standard-setting method in the spring 2019 Royal College of Physicians and Surgeons of Canada internal medicine certification examination consisting of multiple-choice questions.

Author: Bourque J; Exam Quality and Analytics Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada., Skinner H; Exam Quality and Analytics Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada., Dupré J; Exam Quality and Analytics Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada., Bacchus M; Department of Medicine, University of Calgary, Calgary, AB, Canada., Ainslie M; Department of Medicine, University of Manitoba, Winnipeg, MB, Canada., Ma IWY; Department of Medicine, University of Calgary, Calgary, AB, Canada., Cole G; Exam Quality and Analytics Unit, Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada.
Language: English
Source: Journal of educational evaluation for health professions [J Educ Eval Health Prof] 2020; Vol. 17, pp. 12. Date of Electronic Publication: 2020 Apr 20.
DOI: 10.3352/jeehp.2020.17.12
Abstract: Purpose: This study aimed to evaluate the performance of the Ebel standard-setting method in the spring 2019 Royal College of Physicians and Surgeons of Canada internal medicine certification examination, which consisted of multiple-choice questions. Specifically, it examined the inter-rater agreement; the correlation between Ebel scores and item facility indices; the impact of raters' knowledge of the correct answers on the Ebel score; and the effect of rater specialty on inter-rater agreement and Ebel scores.
Methods: Data were drawn from a Royal College of Physicians and Surgeons of Canada certification examination. Ebel's method was applied to 203 multiple-choice questions by 49 raters. Facility indices were derived from the responses of 194 candidates. We computed Fleiss' kappa and the Pearson correlation between Ebel scores and item facility indices, and used t-tests to investigate differences in the Ebel score (with and without the correct answers provided) and differences between internists and other specialists.
Results: Kappa was below 0.15 for both facility and relevance. The correlation between Ebel scores and facility indices was low when correct answers were provided and negligible when they were not. The Ebel score was the same whether or not the correct answers were provided. Inter-rater agreement and Ebel scores did not differ between internists and other specialists.
Conclusion: Inter-rater agreement and the correlations between item Ebel scores and facility indices were consistently low; furthermore, raters' knowledge of the correct answers and rater specialty had no effect on Ebel scores in the present setting.
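As an illustration of the type of analysis named in the Methods, the following is a minimal Python sketch, not the authors' code: Fleiss' kappa for inter-rater agreement, a Pearson correlation between item Ebel scores and facility indices, and an independent-samples t-test for a group comparison. All variable names and the placeholder data are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code) of the analyses named in the Methods.
# Assumed, hypothetical inputs: `ratings` (items x raters, categorical Ebel cells),
# `ebel_scores` and `facility` (one value per item), and two groups of rater scores.
import numpy as np
from scipy import stats
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Inter-rater agreement: Fleiss' kappa over categorical ratings (items x raters).
ratings = np.random.randint(0, 3, size=(203, 49))       # placeholder data
counts, _ = aggregate_raters(ratings)                    # items x categories table
kappa = fleiss_kappa(counts)

# Association between item Ebel scores and item facility indices: Pearson r.
ebel_scores = np.random.rand(203)                        # placeholder data
facility = np.random.rand(203)                           # placeholder data
r, p_r = stats.pearsonr(ebel_scores, facility)

# Group comparison (e.g., internists vs. other specialists): independent t-test.
internists = np.random.rand(25)                          # placeholder data
other_specialists = np.random.rand(24)                   # placeholder data
t, p_t = stats.ttest_ind(internists, other_specialists, equal_var=False)

print(f"Fleiss' kappa={kappa:.3f}, Pearson r={r:.3f}, t={t:.3f}")
```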
Database: MEDLINE