Purpose

Entrustable professional activities (EPAs) have become a cornerstone of assessment in competency-based medical education (CBME). Increasingly, however, EPAs are being adopted that do not conform to published EPA standards. This study aimed to develop and validate a scoring rubric to evaluate EPAs for alignment with their purpose and to identify substandard EPAs.

Method
The EQual rubric was developed and revised by a team of education scholars with expertise in EPAs. It was then applied by four residency program directors/CBME leads (PDs) and four nonclinician support staff to 31 stage-specific EPAs developed for internal medicine within the Royal College of Physicians and Surgeons of Canada's Competency by Design framework. Results were analyzed using a generalizability study (G-study) to evaluate overall reliability, with the EPAs as the object of measurement. Item-level analysis was performed to determine the reliability and discrimination value of each item. Scores from the PDs were also compared with revision decisions made independently by the education scholars group.

Results
The EQual rubric demonstrated high reliability in the G-study when applied by the PDs (phi-coefficient = 0.84) and moderate reliability when applied by the support staff (0.67). Item-level analysis identified three items that performed poorly, with low item discrimination and low interrater reliability indices. Support staff scores correlated only moderately with PD scores. Using the preestablished cut score, PDs identified 9 of the 10 EPAs deemed to require major revision.

Conclusions
EQual rubric scores reliably measured the alignment of EPAs with literature-described standards. Further, applying the rubric accurately identified EPAs requiring major revisions.
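For readers unfamiliar with generalizability theory, the phi-coefficient (dependability index) reported above can be sketched as below for a one-facet design with EPAs as the object of measurement and raters as the facet. The function name and variance components are hypothetical illustrations chosen for exposition, not values taken from the study.

```python
def phi_coefficient(var_object, var_rater, var_residual, n_raters):
    """Dependability (phi) index for an object-by-rater G-study design.

    Absolute error combines the rater main-effect variance and the
    residual variance, averaged over the number of raters.
    """
    absolute_error = (var_rater + var_residual) / n_raters
    return var_object / (var_object + absolute_error)

# Hypothetical variance components: large EPA (object) variance relative
# to rater and residual variance yields high dependability.
print(round(phi_coefficient(4.0, 0.3, 2.8, 4), 2))  # → 0.84
```

As the sketch shows, dependability rises either when EPAs differ more from one another (larger object variance) or when more raters are averaged, which is why the four-PD panel could achieve a phi-coefficient in the 0.8 range.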