Surgical competency requires sound clinical judgment, a systematic diagnostic approach, and integration of a wide variety of nontechnical skills. This more complex aspect of clinician development has traditionally been difficult to measure through standard assessment methods.

OBJECTIVE
This study was conducted to use the Clinical Practice Instrument (CPI) to measure nontechnical diagnostic and management skills during otolaryngology residency training; to determine whether there is demonstrable change in these skills between residents who are in postgraduate years (PGYs) 2, 4, and 5; and to evaluate whether results vary according to subspecialty topic or method of administration.

DESIGN, SETTING, AND PARTICIPANTS
Prospective study using the CPI, an instrument with previously established internal consistency, reproducibility, interrater reliability, discriminant validity, and responsiveness to change, in an otolaryngology residency training program. The CPI was used to evaluate progression in residents' ability to evaluate, diagnose, and manage case-based clinical scenarios. A total of 248 evaluations were performed among 45 otolaryngology resident trainees at regular intervals. Analysis of variance with nesting and postestimation pairwise comparisons were used to evaluate total and domain scores according to training level, subspecialty topic, and method of administration.

INTERVENTIONS
Longitudinal residency educational initiative.

MAIN OUTCOMES AND MEASURES
Assessment with the CPI during PGYs 2, 4, and 5 of residency.

RESULTS
The 45 otolaryngology residents underwent a mean (SD) of 5 (3) CPI administrations (range, 1-4) during their training, for a total of 248 administrations. Total scores differed significantly among PGY levels of training, with lower scores at the PGY-2 level (44) than at the PGY-4 (64) or PGY-5 (69) level (P < .001). Domain scores related to information gathering and organizational skills were acquired earlier in training, while knowledge base and clinical judgment improved later in residency. Trainees scored higher in general otolaryngology (mean [SD], 72) than in the subspecialties (range, 55 [P = .003] to 56 [P < .001]). Neither administering the examination with an electronic rather than a paper-based scoring system nor the calendar year of administration affected these results.

CONCLUSIONS AND RELEVANCE
Standardized interval evaluation with the CPI demonstrates improvement in qualitative diagnostic and management capabilities as PGY levels advance.
This study evaluates use of the Clinical Practice Instrument, a validated tool for assessing acquisition of nontechnical aspects of clinical practice, to measure key diagnostic and management skills during residency training.