Crowdsourced Assessment of Inanimate Biotissue Drills: A Valid and Cost-Effective Way to Evaluate Surgical Trainees.

Author(s): Rice MK (University of Maryland School of Medicine, Baltimore, Maryland); Zenati MS (Department of Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania); Novak SM (Department of Surgery, NorthShore University HealthSystem, Chicago, Illinois); Al Abbas AI (Division of Surgical Oncology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania); Zureikat AH (Division of Surgical Oncology, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania); Zeh HJ 3rd (Department of Surgery, University of Texas Southwestern Medical Center, Dallas, Texas); Hogg ME (Department of Surgery, NorthShore University HealthSystem, Chicago, Illinois; electronic address: MHogg@Northshore.org).
Language: English
Source: Journal of Surgical Education [J Surg Educ] 2019 May-Jun; Vol. 76 (3), pp. 814-823. Date of Electronic Publication: 2018 Nov 22.
DOI: 10.1016/j.jsurg.2018.10.007
Abstract: Objective: Providing feedback to surgical trainees is a critical component of technical skills assessment, yet it remains costly and time consuming. We hypothesize that statistical selection can identify a homogeneous group of nonexpert crowdworkers capable of accurately grading inanimate surgical video.
Design: Applicants auditioned by grading 9 training videos using the Objective Structured Assessment of Technical Skills (OSATS) tool and an error-based checklist. The summed OSATS, summed errors, and OSATS summary score were tested for outliers using Cronbach's alpha and single-measure intraclass correlation. Accepted crowdworkers then submitted grades for videos in 3 formats: the full video at 1× speed, the full video at 2× speed, and segments of the critical sections. Graders were blinded to this study, and a similar statistical analysis was performed.
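The abstract does not include the screening computation itself. The sketch below is a hypothetical illustration of one common way to run such a screen: treat each applicant grader as an "item" in a graders × videos score matrix and flag graders whose removal raises Cronbach's alpha (an "alpha if deleted" check). Function names, the toy data, and the threshold are our assumptions, not the authors' code; the paper additionally used single-measure intraclass correlation, which is omitted here.

```python
# Hypothetical sketch of the outlier-screening step; not the authors' implementation.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a (graders x videos) score matrix.

    Treats each grader as an 'item'; higher alpha means the graders'
    scores are more internally consistent across the videos.
    """
    k = ratings.shape[0]                         # number of graders
    item_vars = ratings.var(axis=1, ddof=1)      # per-grader variance over videos
    total_var = ratings.sum(axis=0).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def flag_outlier_graders(ratings: np.ndarray, min_gain: float = 0.0) -> list[int]:
    """Flag graders whose removal raises alpha ('alpha if deleted' screen)."""
    base = cronbach_alpha(ratings)
    flagged = []
    for i in range(ratings.shape[0]):
        reduced = np.delete(ratings, i, axis=0)
        if cronbach_alpha(reduced) - base > min_gain:
            flagged.append(i)
    return flagged

# Toy usage: 5 applicant graders scoring 9 audition videos;
# the last grader scores at random and should be flagged.
rng = np.random.default_rng(0)
true_quality = rng.normal(20, 4, size=9)
scores = np.vstack(
    [true_quality + rng.normal(0, 1.5, size=9) for _ in range(4)]
    + [rng.normal(20, 4, size=9)]
)
print(cronbach_alpha(scores), flag_outlier_graders(scores))
```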
Setting: The study was conducted at the University of Pittsburgh Medical Center (Pittsburgh, PA), a tertiary care academic teaching hospital.
Participants: Thirty-six premedical students participated as crowdworker applicants, and 2 expert surgeons served as the gold standard for comparison.
Results: For the first hire group, the intraclass correlation of the selected hires was 0.794 for Total OSATS and 0.717 for Total Errors; for the second hire group, it was 0.800 for Total OSATS and 0.654 for Total Errors. Correlation between full videos at 1× and 2× speed was very good, with an inter-item statistic of 0.817 for errors and 0.86 for OSATS. Only moderate correlation was found for the critical-section segments. In 1 year, 275 hours of inanimate video were graded at a cost of $22.27 per video, or $1.03 per minute.
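As a back-of-envelope consistency check (our arithmetic, derived from the reported unit costs rather than stated in the abstract), the two figures jointly imply roughly 763 videos averaging about 21.6 minutes each:

\[
275\,\mathrm{h} = 16{,}500\,\mathrm{min}, \qquad
16{,}500 \times \$1.03 \approx \$16{,}995, \qquad
\frac{16{,}995}{22.27} \approx 763\ \text{videos}, \qquad
\frac{16{,}500}{763} \approx 21.6\ \mathrm{min/video}.
\]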
Conclusions: Statistical selection can identify a homogeneous cohort of crowdworkers to grade trainees' inanimate drills. Crowdworkers could distinguish OSATS metrics and errors in full videos at 2× speed but were less consistent with segmented videos. The program is a comparatively cost-effective way to provide feedback to surgical trainees.
(Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)
Database: MEDLINE