A study of Gleason score interpretation in different groups of UK pathologists; techniques for improving reproducibility
Authors: J Melia, M. C. Parkinson, Rodolfo Montironi, David J. Griffiths, R Moseley, Michael Waller, L J McWilliam, Sue Moss, R Y Ball, Patricia Harnden, Ken Grigor, M Jarmulowicz
Year of publication: 2006
Subjects: Observer Variation; Reproducibility of Results; Pathology; Histology; Second opinion; Decision tree; Gleason grading; Score interpretation; Severity of Illness Index; United Kingdom; Pathology and Forensic Medicine; Neoplasms; Neoplasm Staging; Humans; Quality assurance; General Medicine
Source: Histopathology. 48:655-662
ISSN: 1365-2559; 0309-0167
DOI: 10.1111/j.1365-2559.2006.02394.x
Description: Aims: To test the effectiveness of a teaching resource (a decision tree with diagnostic criteria based on published literature) in improving the proficiency of Gleason grading of prostatic cancer by general pathologists. Methods: A decision tree with diagnostic criteria was developed by a panel of urological pathologists during a reproducibility study. Twenty-four general histopathologists tested this teaching resource. Twenty slides were selected to include a range of Gleason score groups 2–4, 5–6, 7 and 8–10. Interobserver agreement was studied before and after a presentation of the decision tree and criteria. The results were compared with those of the panel of urological pathologists. Results: Before the teaching session, 83% of readings agreed within ± 1 of the panel's consensus scores. Interobserver agreement was low (κ = 0.33) compared with that for the panel (κ = 0.62). After the presentation, 90% of readings agreed within ± 1 of the panel's consensus scores and interobserver agreement amongst the pathologists increased to κ = 0.41. Most improvement in agreement was seen for the Gleason score group 5–6. Conclusions: The lower level of agreement among general pathologists highlights the need to improve observer reproducibility. Improvement associated with a single training session is likely to be limited. Additional strategies include external quality assurance and second opinion within cancer networks.
Database: OpenAIRE
External link:
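
Note: the agreement figures quoted in the description (κ = 0.33, 0.41, 0.62) are kappa statistics, i.e. observed agreement corrected for chance. As a minimal sketch of how such a chance-corrected statistic is computed for two readers, assuming Python with scikit-learn and purely illustrative readings (not data from the study, which pooled many readers and does not specify its exact kappa variant here):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Gleason score-group readings (2-4, 5-6, 7, 8-10) from two
# pathologists for the same 20 slides; values are illustrative only.
reader_a = ["5-6", "7", "8-10", "5-6", "7", "2-4", "5-6", "7", "8-10", "5-6",
            "7", "5-6", "8-10", "7", "5-6", "2-4", "7", "5-6", "8-10", "7"]
reader_b = ["5-6", "7", "7", "5-6", "7", "2-4", "7", "7", "8-10", "5-6",
            "5-6", "5-6", "8-10", "7", "5-6", "5-6", "7", "5-6", "8-10", "7"]

# Unweighted Cohen's kappa: agreement between the two readers beyond chance.
print(f"kappa = {cohen_kappa_score(reader_a, reader_b):.2f}")
```

A panel-wide figure such as κ = 0.33 across twenty-four readers would require a multi-rater statistic (e.g. Fleiss' kappa) rather than this pairwise form.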