HYPOTHESIS: A group of Expert Educators can provide reliable peer review of clinical didactics to medical school faculty in an integrated curriculum.

BACKGROUND:
Recognition of effective teaching and faculty development to promote excellence in teaching should be curricular goals. Peer review can be a challenging component of teaching evaluation to institute. We sought to evaluate the consistency of reviewers in a newly implemented peer review program.

METHODS:
In an observational study of teaching in integrated clerkships, faculty leading didactic sessions were peer-reviewed by a select group of faculty, “Expert Educators” (EEs). EEs reviewed sessions across all disciplines in the third-year clinical curriculum. EEs completed faculty development training in peer observation and giving feedback, and used a standard twelve-item form created using resources from the Stanford Faculty Development Program. Where possible, EEs were paired to review the same didactic session. We compared paired reviews using percent agreement to assess the consistency of the feedback generated.

RESULTS:
Twenty paired and 44 individual peer reviews were completed by EEs. In the paired reviews, average agreement by case was 52%, rising to 87% when scores were dichotomized as acceptable/not acceptable.

DISCUSSION:
EEs showed moderate agreement when reviewing the same didactic session and high agreement when items were dichotomized as acceptable/not acceptable. We plan to use review of taped lectures by 3–4 EEs to continue assessing rater reliability. Additional future directions include comparing peer reviews with reviews from curriculum leaders and students, and with faculty self-reflections.
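The percent-agreement analysis described in METHODS can be sketched as follows. This is a minimal illustration, not the study's actual analysis code: the item scores, the rating scale (assumed 1–5 here), and the threshold used to dichotomize scores as acceptable are all hypothetical, since the abstract does not specify them.

```python
# Sketch of percent agreement between two paired reviews of one didactic
# session. Each review is a list of 12 item scores (hypothetical 1-5 scale).

def percent_agreement(a, b):
    """Fraction of items on which the two reviewers gave identical scores."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def dichotomized_agreement(a, b, threshold=3):
    """Agreement after collapsing each score to acceptable (>= threshold,
    an assumed cutoff) vs. not acceptable."""
    return percent_agreement(
        [x >= threshold for x in a],
        [y >= threshold for y in b],
    )

# Hypothetical paired review: exact scores often differ by one point,
# but the reviewers largely agree on which items are acceptable.
r1 = [4, 3, 5, 2, 4, 4, 3, 5, 4, 3, 4, 5]
r2 = [4, 4, 5, 3, 3, 4, 4, 5, 4, 4, 4, 4]

print(f"exact agreement:        {percent_agreement(r1, r2):.0%}")
print(f"dichotomized agreement: {dichotomized_agreement(r1, r2):.0%}")
```

The gap between the two numbers mirrors the pattern reported in RESULTS: reviewers frequently differ on exact item scores while agreeing on the coarser acceptable/not acceptable judgment.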