Beginning in 2014, US neurology residency programs were required to report each trainee's educational progression within 29 neurology Milestone competency domains. Trainee assessment systems will need to be adapted to inform these requirements. The primary aims of this study were to validate neurology resident assessment content using observable practice activities (OPAs) and to develop assessment formats easily translated to the Neurology Milestones.

Methods:
A modified Delphi technique was used to establish consensus perceptions of importance of 73 neurology OPAs among neurology educators and trainees at 3 neurology residency programs. A content validity score (CVS) was derived for each neurology OPA, with scores ≥4.0 determined in advance to indicate sufficient content validity.

Results:
The mean CVS for all OPAs was 4.4 (range 3.5–5.0). Fifty-seven (78%) OPAs had a CVS ≥4.0, leaving 16 (22%) below the pre-established threshold for content validity. Trainees assigned a higher importance to individual OPAs (mean CVS 4.6) compared to faculty (mean 4.4, p = 0.016), but the effect size was small (η2 = 0.10). There was no demonstrated effect of length of education experience on perceived importance of neurology OPAs (p = 0.23). Two sample resident assessment formats were developed, one using neurology OPAs alone and another using a combination of neurology OPAs and the Neurology Milestones.

Conclusions:
This study provides neurology training programs with content validity evidence for items to include in resident assessments, along with sample assessment formats that translate directly to the Neurology Milestones. Length of education experience has little effect on perceptions of neurology OPA importance.