Description: |
Rationale and objectives: We surveyed program directors to determine current radiology program practices in evaluating their residents, faculty, and programs.

Materials and methods: In January 2003, a 52-item Web-based survey was made available to the program directors of accredited core radiology programs. Responses were tabulated to determine the relative frequency distribution. Two-tailed Pearson chi-square tests were used to compare proportions and to assess associations between variables.

Results: A total of 99 (52%) of 192 program directors responded. Programs were largely in compliance with Accreditation Council for Graduate Medical Education (ACGME) requirements. Noncompliance concerned the requirement to evaluate residents at least four times per year in at least 22 (22.2%) of 99 programs and the requirement to evaluate the program annually in 20 (20.2%) of 99 programs. New program directors (P = .03). Programs that used this form, compared with those that did not, were more likely to evaluate resident competence in systems-based practice (88.5% versus 44.0%, P = .001). Having been a program director for 1 or more years, versus less than 1 year, was associated with use of a computerized evaluation system (35.8% versus 11.8%, P = .05).

Conclusion: In general, radiology programs show a high degree of compliance with ACGME evaluation requirements. However, some programs do not comply with the requirements for frequency of resident evaluation or for annual program evaluation. The percentage of new program directors is high and is related to not using, or not knowing about, useful evaluation resources. Use of computerized evaluation systems, which could reduce the work associated with evaluations and provide more dependable and reliable data, remains minimal. |