Building the Plane As We Fly It

Author: William Iobst
Year of publication: 2015
Source: Journal of Graduate Medical Education. 7:259-261
ISSN: 1949-8357, 1949-8349
Description: With the launch of the Outcomes Project in 2001, the graduate medical education (GME) community began the transition to competency-based medical education (CBME). While this launch promised revolutionary change to the status quo of GME, it is safe to say that the work of transitioning to a competency-based paradigm has been a slow evolutionary process that remains a work in progress. Given the enormous task of implementing true competency-based training, most educators realize that for the foreseeable future, GME will exist as a hybrid model of traditional and CBME components.1 The release of the Accreditation Council for Graduate Medical Education milestones, developed from the existing literature and expert consensus on dimensions of physician performance that are relevant in practice, holds significant promise to advance this field. However, the milestones are not a finished product. They must be vetted and refined, and faculty must be given specific professional development in the appropriate use of this framework.

In this issue of the Journal of Graduate Medical Education, Dehon et al2 report on the use of a novel milestone-based, end-of-shift evaluation (ESE) of emergency medicine residents. This study highlights the challenge of advancing competency-based assessment. Using this ESE, the authors identified grade inflation, or "milestone inflation," and concluded that use of milestones as a stand-alone assessment tool is not advised. While their findings question the utility of this milestone-based assessment strategy, caution is advised when applying their findings to milestone-based assessment in general.

To appreciate the challenge of developing milestone-based assessment, a review of a number of basic principles of CBME and of milestones is helpful. CBME is an outcomes-based approach to the design, implementation, and evaluation of a medical education program using an organizing framework of competencies.3 Rather than relying on time and process measures as proxies of competence, CBME aims to document the developmental progression of competence through authentic, preferably work-based assessment. This fundamental difference has profound implications for the learning environment, the expectations of both faculty and learners, and the approach to assessment. Medical educators must appreciate these differences to most effectively develop and pilot milestones and competency-based innovations. Key concepts include:

1. Milestones, written as measurable and observable behaviors, define a developmental continuum of knowledge, skills, and attitudes in each of the general competencies. Essentially acting as a roadmap, milestones are intended to facilitate criterion-referenced and preferably work-based assessment of trainees. With appropriate faculty development, the developmental continuum they describe can create a common understanding of competence to inform the use of specific assessment tools, such as direct observation or chart-stimulated recall. Written in narrative form, milestones capitalize on literature identifying the value of narrative descriptions rather than numerical scales for assessment.4 The milestones describe the gradual, developmental acquisition of competence and are specifically intended to provide a framework informing formative assessment and feedback.5,6 Finally, the milestones are formulated to serve as a framework or guide for longitudinal assessment and a group decision process for determining competence, rather than a single point-in-time assessment.7 Using specific milestones abstracted from that developmental continuum to inform yes/no decisions in the very brief assessment window of a single emergency medicine shift risks the loss of these benefits and may explain the disappointing performance of the ESE reported by Dehon et al.2 The use of milestones as a simple checklist also risks deconstructing the highly complex work of patient care. This realization, and the recognition that the resources required to assess all possible milestones are prohibitive, has led to increased interest in entrustment-based assessment, or entrustable professional activities (EPAs).8,9 EPAs focus assessment on the actual work of patient care: those discrete activities that all physicians are trusted to do. By focusing assessment on activities that incorporate multiple milestones and general competencies, EPA-based assessment offers the potential for a more manageable approach to work-based assessment.

2. Competency-based assessment ideally serves the dual purpose of providing assessment both "of" and "for" learning. While assessment "of" learning provides more summative feedback and is intended to inform decisions about progression, assessment "for" learning is formative and is intended to support and enhance learning.10,11 Dehon and associates2 describe a summative assessment using preselected milestones that have either been achieved or not (Yes, No, N/A). By capturing faculty perceptions (or comments), such as "felt bad giving a no," the authors illuminate some of the pitfalls of this strategy.2 Had the full milestone continuum been provided for review, faculty might have selected a milestone more appropriate to the learner's level of competence without the sense that they were making a high-stakes decision or assigning a grade. Ideally, faculty should offer, and learners should embrace, feedback identifying both accomplishments and gaps in performance to advance competence in any given milestone.

3. Milestones were written to describe discrete observable abilities. To best assess these abilities, many milestone-writing committees envisioned the use of authentic, work-based assessment, including direct observation. However, as noted by Howley and Wilson,12 direct observation of learners occurs infrequently. Assessment using direct observation is complex and requires significant faculty development, but this should not prevent our community from embracing this assessment strategy.13 The method(s) of assessment that informed the "Yes, No, N/A" decisions in the Dehon et al2 study are not specified. However, if the determination was a global or gestalt judgment, revisiting this study using direct observation informed by specific criterion-referenced standards that define the learning expectations of the rotation or shift might have generated a different outcome.

4. Developing milestone-based assessments will require significant professional development for all key stakeholders in the GME enterprise. Studies by Crosson et al14 and Mattar et al15 highlight that, despite being well over a decade into the Outcomes Project, physicians complete graduate medical training and enter the workforce lacking competence in a wide range of fundamental skills needed for safe and effective patient care. Such findings suggest that there are dimensions of physician practice in which we are neither teaching nor assessing our trainees effectively. Writing a milestone framework that defines the general competencies has been celebrated as a significant accomplishment. However, the act of writing milestones alone is not sufficient. Complementary work must be directed toward ensuring that GME educators understand how to use the milestones when developing curricula and assessments for specific rotations. Of equal importance, our learners must understand that they are not assumed to be competent on day 1 of their training. They should expect robust assessment and feedback intended to guide them along a developmental path to competence.

In this issue of JGME, Dehon and associates2 highlight the challenging work of developing milestone-based assessment. Some might interpret the findings of their study as confirmation that the milestones are ill conceived and might call for a better framework to guide competency-based assessment. While some may find this an appropriate conclusion, I disagree. I believe the authors have highlighted the importance of understanding just how profoundly CBME is changing the status quo of GME. CBME represents a revolutionary and disruptive change. Any innovation intended to advance CBME must appreciate the extent of this change and must ensure that the basic principles of CBME, as outlined above, inform study design. Well over a decade into the current CBME experiment, we cannot shy away from the hard work of advancing this field. That work must ultimately define competency-based assessment systems that reliably inform the use of milestones in the attestation of competence. Given the limited resources currently available to accomplish this charge, an integrated effort engaging the education community, GME accreditors, and funders should be formalized to guide this work. I applaud Dehon and colleagues2 for advancing our knowledge of milestone assessment by highlighting that not all innovations will achieve desired outcomes. We will misstep as we forge forward, yet we must ensure that what we learn continually advances the collective wisdom of our community. What is certain is that a return to the past is not the answer.
Database: OpenAIRE