Videotaping Practical Examinations in Physical Therapist Education: Does It Foster Student Performance, Self-Assessment, Professionalism, and Improve Instructor Grading?


Abstract

Background and Purpose.

Reflection and self-assessment are keystones of professionalism and skill acquisition and are integral components of clinical practice. The purpose of this study was to explore the use of videotaping during student physical therapist (PT) practical examinations as a mechanism to promote self-assessment of skills and reflection on professional communication behaviors. Additionally, the study explored whether video-based self-assessment improved student exam scores, and whether instructor scoring of practical examinations differed when performed live versus from a videotaped performance.

Participants.

Fifty-one PT students enrolled in either an introductory assessment course (n = 24) in the curriculum's second semester or an orthopedics course (n = 27) in the fourth semester.

Methods.

In this repeated-measures design, subjects in both cohorts were randomly assigned to 1 of 2 groups: a video group or a non-video group. Both groups were asked to self-assess clinical skills and professional behavior immediately following both a midterm and a final practical examination, prior to receiving written feedback. All students were videotaped during exams, but only the video group used the tapes for self-assessment. The instructor graded all students during the exam on total practical score, professional score, and clinical skills score. The instructor then viewed and re-graded the video group's taped performances to compare live versus videotaped grading.

Data Analysis.

Practical exam scores between groups (video versus non-video) were compared over time (midterm and final) with a 2 × 2 mixed analysis of variance (ANOVA), run separately for instructor live scores and student self-scores. A 2-tailed paired-sample t test was used to determine the effect of the instructor's viewing method (live versus videotaped) on practical exam scores. Difference scores were calculated by subtracting each student's self-score from the corresponding instructor score on both the midterm and the final; the change in this student-instructor difference between groups (video versus non-video) over time was analyzed with a 2 × 2 mixed ANOVA.
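For readers who want to reproduce this style of analysis, a minimal sketch follows. It is not the authors' code: the synthetic data, the column names, and the use of the pingouin and scipy Python libraries are all assumptions made for illustration.

```python
# Illustrative sketch of the analyses described above; the synthetic
# data and all column names are assumptions, not the study's dataset.
import numpy as np
import pandas as pd
import pingouin as pg
from scipy import stats

rng = np.random.default_rng(0)
n = 24  # subjects per group (hypothetical)

# Long-format exam scores: one row per subject per exam.
df = pd.DataFrame({
    "subject": np.tile(np.arange(2 * n), 2),
    "group": np.tile(np.repeat(["video", "non-video"], n), 2),
    "time": np.repeat(["midterm", "final"], 2 * n),
    "instructor_score": rng.normal(85, 5, 4 * n),
    "self_score": rng.normal(83, 6, 4 * n),
})

# 2 x 2 mixed ANOVA: group (between-subjects) x time (within-subjects).
aov = pg.mixed_anova(data=df, dv="instructor_score", within="time",
                     subject="subject", between="group")
print(aov.round(3))

# Difference score (instructor minus student self-score) as a proxy
# for self-assessment accuracy, analyzed with the same 2 x 2 design.
df["difference"] = df["instructor_score"] - df["self_score"]
acc = pg.mixed_anova(data=df, dv="difference", within="time",
                     subject="subject", between="group")
print(acc.round(3))

# 2-tailed paired-sample t test: the instructor's live score versus
# the videotaped re-grade of the same performance (video group only).
live = rng.normal(85, 5, n)
taped = live - rng.normal(1.5, 2, n)  # taped re-grades trend lower
t, p = stats.ttest_rel(live, taped)
print(f"live vs. taped: t = {t:.2f}, p = {p:.4f}")
```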

Results.

For instructor live scoring, there was no significant interaction of intervention (video versus non-video feedback) with time. For student self-scoring, all students improved significantly between the midterm and final examinations in total practical score and in professional behavior score (P < .05). Comparing the instructor's live scoring with videotaped scoring, all 3 practical exam scores (total, professional, and clinical skills) were significantly lower after viewing a videotaped performance than when scored live at the final exam (P < .05). Self-scoring accuracy on the professional behavior score improved significantly in all students from the midterm to the final (P < .05), with no significant improvement in total score or clinical skills score and no difference in accuracy between groups.

Discussion and Conclusion.

Students who used videotapes to self-assess their practical performance, compared with students who self-assessed without viewing a video, did not demonstrate greater improvement in final exam scores, professional behaviors, or self-scoring accuracy relative to instructor scoring. Videotaping practical examinations may therefore not be a beneficial educational tool. Outlining expectations and providing a detailed rubric before practical examinations, along with ample feedback afterward, may be of greatest benefit to student learning outcomes.
