BACKGROUND:
Smartphone-based evaluation tools have been proposed to improve feedback in medical education settings, but few studies have examined outcomes among instructors.

METHODS:
We performed a randomized controlled trial comparing medical instructors' experience with two types of evaluations in the OB/GYN clerkship at the University of Washington School of Medicine. Eighteen teaching sites participated in the study and were randomized to smartphone-based QR (quick response) code evaluations or paper evaluations. Instructors, defined as attending or resident physicians involved in third-year student education, completed a post-intervention survey with 10 Likert-type questions about the evaluation tool. We compared responses between the groups using χ² tests.

RESULTS:
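As an illustration of the χ² comparison described above, the sketch below dichotomizes a Likert item into agree / not-agree and compares the two groups in a 2×2 table. The counts are hypothetical, chosen only to show the mechanics; they are not the study's data.

```python
# Sketch of a chi-square comparison of dichotomized Likert responses
# between two groups, using hypothetical counts (NOT the study's data).
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test (df=1, no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (chi2, p)."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1, the chi-square survival function reduces to erfc.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical item: 12/12 QR instructors vs 3/7 paper instructors agreed.
chi2, p = chi_square_2x2(12, 0, 3, 4)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

In practice, small cell counts like these would often call for a continuity correction or Fisher's exact test; the uncorrected form is shown only for simplicity.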
Nineteen of 29 (65%) instructors completed the survey. Groups were similar in age, gender, level of training, level of student interaction, comfort with technology, and prior knowledge and use of QR codes. Compared to those using paper evaluations, instructors using QR evaluations were significantly more likely to agree that the evaluation tool was easy to understand (100% versus 43%, P=.013) and easy to navigate (82% versus 57%, P=.042), and that it was effective in providing feedback (75% versus 29%, P=.015). Evaluators using QR evaluations also felt more comfortable approaching students with the evaluation tool (92% versus 43%, P=.012). We found no differences in timeliness or frequency of feedback between the groups.

DISCUSSION:
Instructors found the QR evaluation to be a superior alternative to the paper evaluation for providing feedback to medical students.