Popis: |
  Background: In Ontario, Canada, a government agency known as Ontario Health is responsible for making audit and feedback reports available to all family physicians to encourage ongoing quality improvement. The confidential report provides summary data on 3 key areas of practice: safe prescribing, cancer screening, and diabetes management.
  Objective: This report was redesigned to improve its usability in line with evidence. The objective of this study was to explore how the redesign was perceived, with an emphasis on recipients’ understanding of the report and their engagement with it.
  Methods: We conducted qualitative semistructured interviews with family physicians who had experience with both versions of the report, recruited through purposeful and snowball sampling. We analyzed the transcripts following an emergent and iterative approach.
  Results: Saturation was reached after 17 family physicians had participated. In total, 2 key themes emerged as factors that affected the perceived usability of the report: alignment between the report and the recipients’ expectations, and the capacity to engage in quality improvement. Family physicians expected the report and its quality indicators to reflect best practices and to be valid and accurate. They also expected the report to offer feedback on the clinical activities they perceived to be within their control to change. Furthermore, family physicians expected the goal of the report to align with their perspective on feasible quality improvement activities. Most of these expectations were not met, limiting the perceived usability of the report. The capacity to engage with audit and feedback was hindered by several organizational and physician-level barriers, including a lack of fit with existing workflows, competing priorities, time constraints, and insufficient skills for bridging the gap between the data and the corresponding desired actions.
  Conclusions: Despite recognized improvements in the design of the report to better align with best practices, it was not perceived as highly usable. Improvements in the presentation of the data could not overcome the misalignment with family physicians’ expectations or their limited capacity to engage with the report. Integrating iterative evaluations informed by user-centered design can complement evidence-based guidance for implementation strategies. Creating a space that brings together audit and feedback designers and recipients may help improve usability and effectiveness.