Evaluating the Performance of the Academic Coordinator/Director of Clinical Education in Physical Therapist Education: Developing a Tool to Solicit Input From Center Coordinators of Clinical Education and Clinical Instructors

Abstract

Background and Purpose.

The academic coordinators of clinical education/directors of clinical education (referred to in this article as DCEs) in professional (entry-level) physical therapist education programs have unique responsibilities that are not reflected in traditional faculty evaluation tools. In 2006, Buccieri and Brown published relevant criteria and identified clinical educators (CEs) as appropriate reviewers for a multisource assessment of the DCE. Clinical educators include both center coordinators of clinical education (CCCEs) and clinical instructors (CIs). The purposes of this project were to develop a model assessment tool for CEs to evaluate DCEs and to examine the tool's content validity.

Method/Model Description and Evaluation.

A purposive sample of CEs representing diverse practice settings and geographic locations was selected for participation in an electronic survey. Ninety-one CEs completed the survey. More than 60% of the respondents indicated that they could provide moderate to significant information for each criterion in the categories of Administration and Teaching. Forty-five percent or fewer of the respondents indicated that they could provide moderate to significant information regarding the Service category criteria. The second phase of the project tested the tool by sending two versions of the model tool (one including Service criteria, one omitting Service criteria) to CEs supervising students from three academic institutions (N = 591). The CEs were asked to complete the tool as an assessment of the DCEs from their respective academic programs and to provide feedback to the principal investigator regarding the format and mechanics of using the tool.

Outcomes.

A model tool to solicit feedback from CEs for the performance evaluation of a DCE was developed based on agreement between CEs and DCEs on identified Administration and Teaching criteria. Little meaningful information regarding Service performance was obtained from the CEs, suggesting that Service criteria should be omitted when soliciting evaluative input from CEs. Information from the trial use of the tool indicated that valuable Administration and Teaching feedback was received and could be tabulated for inclusion in the DCE faculty evaluation.

Discussion and Conclusion.

This assessment tool was found to have adequate content validity and can be used to gather CE input for the performance evaluation of the DCE. Information obtained may be included in promotion/tenure portfolios and used to guide professional development. Future research should consider establishing the validity of criteria for student, program director, and academic faculty input to a multisource assessment of the DCE.
