A97 TOOLS FOR DIRECT OBSERVATION AND ASSESSMENT OF COLONOSCOPY: A SYSTEMATIC REVIEW OF VALIDITY EVIDENCE

Authors: C M Walsh, Rishad Khan, Nikko Gimpaya, G McCreath, Samir C. Grover, E Zheng, Sachin Wani, John T. Anderson, Michael A. Scaffidi, Thurarshen Jeyalingam
Year of publication: 2021
Source: Journal of the Canadian Association of Gastroenterology. 4:71-73
ISSN: 2515-2092, 2515-2084
Description:

Background: An increasing focus on quality and safety in colonoscopy has led to broader implementation of competency-based educational systems that enable documentation of trainees' achievement of the knowledge, skills, and attitudes needed for independent practice. The meaningful assessment of competence in colonoscopy is critical to this process. While many published tools assess competence in performing colonoscopy, the strength of their underlying validity evidence varies widely. Tools with strong validity evidence are required to support feedback provision, optimize learner capabilities, and document competence.

Aims: We aimed to evaluate the strength of validity evidence supporting available colonoscopy direct observation assessment tools using the unified framework of validity.

Methods: We systematically searched five databases for studies investigating colonoscopy direct observation assessment tools from inception until April 8, 2020. We extracted data outlining validity evidence from the five sources (content, response process, internal structure, relations to other variables, and consequences) and graded the degree of evidence, with a maximum score of 15. We assessed educational utility using an Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument (MERSQI).

Results: From 10,841 records, we identified 27 studies representing 13 assessment tools (10 adult, 2 pediatric, 1 both). All tools assessed technical skills, while 10 also assessed cognitive and integrative skills. Validity evidence scores ranged from 1 to 15. The Assessment of Competency in Endoscopy (ACE) tool, the Direct Observation of Procedural Skills (DOPS) tool, and the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) had the strongest validity evidence, with scores of 13, 15, and 14, respectively. Most tools were easy to use and interpret and required minimal resources. MERSQI scores ranged from 9.5 to 11.5 (maximum score 14.5).

Conclusions: The ACE, DOPS, and GiECAT have stronger validity evidence than other available assessments. Future studies should identify barriers to widespread implementation and report on the use of these tools for credentialing purposes.

Funding Agencies: None
Database: OpenAIRE