Expanded Access to Video-Based Laparoscopic Skills Assessments: Ease, Reliability, and Accuracy.

Author: Lund S; Mayo Clinic Department of Surgery, 200 1st Street SW, Rochester, Minnesota 55905. Electronic address: lund.sarah@mayo.edu., Navarro S; Mayo Clinic Department of Surgery, 200 1st Street SW, Rochester, Minnesota 55905., D'Angelo JD; Mayo Clinic Division of Colon and Rectal Surgery, 200 1st Street SW, Rochester, Minnesota 55905., Park YS; Department of Medical Education, University of Illinois at Chicago College of Medicine, 808 S Wood Street, Chicago, Illinois 60612., Rivera M; Mayo Clinic Division of Trauma, Critical Care, and General Surgery, 200 1st Street SW, Rochester, Minnesota 55905.
Language: English
Source: Journal of Surgical Education [J Surg Educ] 2024 Jun; Vol. 81 (6), pp. 850-857. Date of Electronic Publication: 2024 Apr 24.
DOI: 10.1016/j.jsurg.2024.03.010
Abstract: Objective: Video-based performance assessments provide essential feedback to surgical residents, but in-person and remote video-based assessment by trained proctors incurs significant cost. We aimed to determine the reliability, accuracy, and difficulty of untrained attending staff surgeon raters completing video-based assessments of a basic laparoscopic skill. Secondarily, we aimed to compare reliability and accuracy between 2 different types of assessment tools.
Design: An anonymous survey was distributed electronically to surgical attendings via a national organizational listserv. Survey items included demographics, rating of video-based assessment experience (1 = have never completed video-based assessments, 5 = often complete video-based assessments), and rating of favorability toward video-based and in-person assessments (0 = not favorable, 100 = favorable). Participants watched 2 laparoscopic peg transfer performances, then rated each performance using an Objective Structured Assessment of Technical Skill (OSATS) form and the McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS). Participants then rated assessment completion ease (1 = Very Easy, 5 = Very Difficult).
Setting: National survey of practicing surgeons.
Participants: Sixty-one surgery attendings with experience in laparoscopic surgery from 10 institutions participated as untrained raters. Six experienced laparoscopic skills proctors participated as expert raters.
Results: Inter-rater reliability was substantial for both OSATS (κ = 0.75) and MISTELS (κ = 0.85). MISTELS accuracy was significantly higher than that of OSATS (κ: MISTELS = 0.18, 95% CI = [0.06, 0.29]; OSATS = 0.02, 95% CI = [-0.01, 0.04]). While participants were inexperienced with completing video-based assessments (median = 1/5), they perceived video-based assessments favorably (mean = 73.4) and felt assessment completion was "Easy" on average.
Conclusions: We demonstrate that faculty raters untrained in simulation-based assessments can successfully complete video-based assessments of basic laparoscopic skills with substantial inter-rater reliability without marked difficulty. These findings suggest an opportunity to increase access to feedback for trainees using video-based assessment of fundamental skills in laparoscopic surgery.
(Copyright © 2024 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)
Database: MEDLINE