This study was performed to identify factors that affect student performance on a web-based objective structured clinical evaluation (OSCE) that was developed to improve the evaluation process for students completing a fourth-year surgical clerkship in trauma-critical care.

Methods
We created a multiple-choice OSCE with commercially available software. Clinical cases were developed for incorporation into 7 quizzes that were assembled to appear as 1 examination. Students used intensive care unit flow sheets to review data, develop a systems-based problem list and differential diagnoses, and produce treatment recommendations.

Results
No difference was noted between the mean scores achieved by students on the previous paper (essay-format) OSCE and the new web OSCE. Student performance on the web OSCE correlated with performance on the National Board of Medical Examiners (NBME) subject examination completed the previous year (r = 0.60; P < .0001). Performance on the NBME subject examination was the only independent factor that affected the reporter, interpreter, and manager skills assessed by the OSCE (P < .01).

Conclusion
Implementation of the web OSCE yielded class performance similar to that observed on the previous paper OSCE. The correlation of student achievement on the web OSCE with the NBME subject examination supports the construct validity of this institutional examination, beyond the face and content validity in which OSCEs may excel.