Evaluating the Performance of Medical Educators: A Novel Analysis Tool to Demonstrate the Quality and Impact of Educational Activities
Abstract

Purpose

Traditional promotion standards rely heavily on quantification of research grants and publications in the curriculum vitae. The promotion and retention of educators is challenged by the lack of accepted standards to evaluate the depth, breadth, quality, and impact of educational activities. The authors sought to develop a practical analysis tool for the evaluation of educator portfolios (EPs), based on measurable outcomes that allow reproducible analysis of the quality and impact of educational activities.

Method

The authors, 10 veteran educators and an external expert evaluator, used a scholarly, iterative consensus-building process to develop the tool and to test it against real EPs submitted by educational scholars who had followed an EP template. They revised the template in parallel with the analysis tool to ensure that EP data enabled valid and reliable evaluation. The authors created the EP template and analysis tool for scholar and program evaluation in the Educational Scholars Program, a three-year national certification program of the Academic Pediatric Association.

Results

The analysis tool combines 18 quantitative and 25 qualitative items, each with scoring specifications, to support objective evaluation of educational activities and scholarship.

Conclusions

The authors offer this comprehensive yet practical tool as a method to enhance opportunities for faculty promotion and advancement, based on well-defined and documented educational outcome measures. It is relevant for clinical educators across disciplines and institutions. Future studies will test the interrater reliability of the tool, using data from EPs written with the revised template.
