Rating the Quality of Entrustable Professional Activities: Content Validation and Associations with the Clinical Context.

Author(s): Post, Jason A.; Wittich, Christopher M.; Thomas, Kris G.; Dupras, Denise M.; Halvorsen, Andrew J.; Mandrekar, Jay N.; Oxentenko, Amy S.; Beckman, Thomas J.
Source: JGIM: Journal of General Internal Medicine; May 2016, Vol. 31, Issue 5, p518-523, 6p
Abstract:
Background: Entrustable professional activities (EPAs) have been developed to assess resident physicians with respect to Accreditation Council for Graduate Medical Education (ACGME) competencies and milestones. Although the feasibility of using EPAs has been reported, we are unaware of previous validation studies on EPAs or of potential associations between EPA quality scores and characteristics of educational programs.
Objectives: Our aim was to validate an instrument for assessing the quality of EPAs used to assess internal medicine residents, and to examine associations between EPA quality scores and features of rotations.
Design: This was a prospective content validation study to design an instrument to measure the quality of EPAs written for assessing internal medicine residents.
Participants: Residency leadership at Mayo Clinic, Rochester, participated in this study, including the program director, associate program directors, and individual rotation directors.
Interventions: The authors reviewed the salient literature, developed items to reflect the domains of EPAs useful for assessment, and further tested and refined the instrument. Each participating rotation director created EPAs that they felt would be meaningful for assessing learner performance in their area. These 229 EPAs were then rated for quality with the QUEPA instrument.
Main Measures: Performance characteristics of the QUEPA are reported. Quality ratings of EPAs were compared by primary ACGME competency, inpatient versus outpatient setting, and specialty type.
Key Results: QUEPA tool scores demonstrated excellent reliability (ICC range 0.72 to 0.94). Inpatient-focused EPAs were rated higher than outpatient-focused EPAs (3.88 vs. 3.66; p = 0.03). Medical knowledge EPAs scored significantly lower than EPAs assessing other competencies (3.34 vs. 4.00; p < 0.0001).
Conclusions: The QUEPA tool is supported by good validity evidence and may help in rating the quality of EPAs developed by individual programs. Programs should take care when writing EPAs for the outpatient setting or for assessing medical knowledge, as these tended to be rated lower. [ABSTRACT FROM AUTHOR]
Database: Complementary Index