Interrater Agreement of the Copenhagen Triage Algorithm

Author: Lisbet Ravn, Martin Schultz, Morten Lind, Julie Inge-Marie Helene Borchsenius, R. B. Hasselbalch, Kasper Kermark Iversen, Thomas Kallemose, Lars S. Rasmussen
Year of publication: 2020
Subject:
Description: Introduction Systematic triage is performed in the Emergency Department (ED) to assess the urgency of care for each patient. The Copenhagen Triage Algorithm (CTA) is a newly developed, evidence-based triage system; however, its interrater agreement remains unknown. Method This was a prospective cohort study. Data were collected in the three sections (Acute/Cardiology, Medicine, and Surgery) of the ED of Herlev Hospital. Each patient was assessed independently by two nurses using CTA. The interrater variability of CTA was calculated using Fleiss' kappa, and the analysis was stratified by whether the raters had less than or more than 2 years of ED experience. Results A total of 110 patients were included, of which 10 were excluded due to incomplete data. The raters agreed on the triage category 80% of the time, corresponding to a kappa value of 0.70 (95% confidence interval 0.57-0.83). Stratified by ED section, agreement was 83% in the Acute/Cardiology section, corresponding to a kappa value of 0.73 (0.55-0.91), and 79% in the Medicine section, corresponding to a kappa value of 0.64 (0.39-0.89); in the Surgery section the kappa value was 0.56 (0.21-0.90). Experienced raters had an interrater agreement (kappa) of 0.73 (0.56-0.90), while less experienced raters had an agreement of 0.76 (0.28-1.24). Conclusion A substantial interrater agreement was found for the Copenhagen Triage Algorithm.
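For orientation only, the sketch below shows how the two agreement measures reported in the abstract (raw percent agreement and Fleiss' kappa) can be computed for paired triage ratings with statsmodels. The ratings are hypothetical placeholders, not the study's data, and the category coding is an assumption.

# Minimal sketch (illustrative data only, not the study's ratings):
# percent agreement and Fleiss' kappa for two raters per patient.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical triage categories (e.g., 0 = lowest urgency ... 3 = highest)
# assigned by two nurses to the same patients; replace with real paired ratings.
rater_a = np.array([0, 1, 2, 2, 3, 1, 0, 2, 1, 3])
rater_b = np.array([0, 1, 2, 1, 3, 1, 0, 2, 2, 3])

# Raw percent agreement: share of patients given the same category.
percent_agreement = np.mean(rater_a == rater_b)

# Chance-corrected agreement via Fleiss' kappa (the statistic named in the
# abstract); aggregate_raters converts the subjects-by-raters matrix into
# the subjects-by-categories count table that fleiss_kappa expects.
table, _ = aggregate_raters(np.column_stack([rater_a, rater_b]))
kappa = fleiss_kappa(table, method="fleiss")

print(f"percent agreement: {percent_agreement:.2f}, kappa: {kappa:.2f}")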
Database: OpenAIRE