Scientific Session 11 — SS11: Efficacy/Administration/Informatics - Education

Tuesday, May 7, 2019

Abstracts 1201-2871



1878. Diagnostic Radiology Residency Assessment Tools: A Systematic Review

Tu W*, Hibbert R, Kontolemos M, Dang W, McInnes M. The University of Ottawa/The Ottawa Hospital, Ottawa, Canada

Address correspondence to W. Tu (wtu012@uottawa.ca)

Objective: The multifaceted learning in diagnostic radiology residency requires a variety of assessment methods. National accreditation bodies are encouraging a move from a time-based to a competency-based residency model, a shift that creates a need for assessment tools tailored to resident training. The breadth, quality, and availability of assessment tools for radiology residents have not been formally examined. The purpose of this study was to perform a scoping systematic review to identify which assessment tools are available for radiology resident training and to assess the validity of these tools.

Materials and Methods: A literature search was conducted across multiple databases, including Medline, Embase, Cochrane Trials, the Database of Abstracts of Reviews of Effects, and the Cochrane Database of Systematic Reviews, with the assistance of an experienced hospital librarian. The inclusion criterion was any tool used to assess radiology resident competence. A search of the gray literature, including abstracts from major radiology conferences and online resources for resident training, was also performed. Initial screening of search results and data extraction were done by two independent evaluators, with disagreements resolved by consensus discussion with a third reviewer. Extracted data included characteristics of the residents and evaluators and specifics of each evaluation tool. The validity of each tool was examined with a customized instrument, developed with the assistance of an education researcher using the Standards for Educational and Psychological Testing, covering five categories of validity: content, response process, internal structure, relations to other variables, and consequences. Interrater agreement was calculated with weighted kappa.
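The weighted kappa statistic mentioned above credits partial agreement between the two reviewers when their ordinal ratings are close but not identical. A minimal sketch of a linear-weighted Cohen's kappa follows; the function name and the example ratings are illustrative assumptions, not data from the study.

```python
def weighted_kappa(rater_a, rater_b, n_categories):
    """Linear-weighted Cohen's kappa for two raters' ordinal ratings.

    Ratings are integers in 0..n_categories-1 (e.g., two reviewers'
    scores for a validity domain). Illustrative sketch only.
    """
    n = len(rater_a)
    assert n == len(rater_b) and n > 0 and n_categories > 1

    # Observed joint distribution of rating pairs.
    observed = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1.0 / n

    # Marginal distributions for each rater (chance-agreement model).
    marg_a = [sum(row) for row in observed]
    marg_b = [sum(observed[i][j] for i in range(n_categories))
              for j in range(n_categories)]

    # Linear weights: full credit on the diagonal, decreasing with
    # the distance between the two ratings.
    def weight(i, j):
        return 1.0 - abs(i - j) / (n_categories - 1)

    p_obs = sum(weight(i, j) * observed[i][j]
                for i in range(n_categories) for j in range(n_categories))
    p_exp = sum(weight(i, j) * marg_a[i] * marg_b[j]
                for i in range(n_categories) for j in range(n_categories))
    return (p_obs - p_exp) / (1.0 - p_exp)
```

Perfect agreement yields kappa = 1, chance-level agreement yields approximately 0, and near-misses (adjacent categories) are penalized less than distant disagreements.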

Results: The initial search returned 445 articles; 50 met the inclusion criteria. By focus, the tools assessed: overall radiology knowledge, 15 (30%); subspecialty competencies, eight (16%); on-call competency, seven (14%); reporting skills, seven (14%); procedural competence, three (6%); communication skills, three (6%); perception skills, three (6%); professionalism, two (4%); and end-of-rotation evaluations, two (4%). Collectively, these tools were used by 326 evaluators to assess 14,188 residents at institutions in North America, Europe, and Asia. Regarding validity, most articles (56%) did not assess validity at all, and only 14% presented evidence from multiple domains. Among single categories of validity, generalizability was the most commonly assessed domain (12%).

Conclusion: We identified 50 evaluation tools for radiology resident training covering a broad range of areas. However, the validity of these tools varies widely: some have not been formally assessed in any validity domain, and others have been evaluated inconsistently. Ideally, these tools should be validated to ensure they are applicable before implementation in a competency-based resident education curriculum.