L. Hidalgo1, R. Marinas1, C. Salmon1, T. Godoy-Bobbio1, L. Pilgrim1, M. Wallace1
1University of St. Augustine, College of Rehabilitative Sciences - DPT Program, Coral Gables, United States
Background: Accreditation criteria mandate the evaluation of student technical skills, which is usually accomplished in a face-to-face format. In response to the COVID-19 pandemic, the DPT program transitioned course lecture and laboratory components, including testing, to a remote format using an online learning management system. The transition highlighted an emerging need for research on virtual examination of technical skills and on faculty rating consistency in remote testing formats.
Purpose: This study investigates the impact of faculty rater consistency and virtual practical assessment method on student performance.
Methods: The two virtual practical assessment methods used were:
(1) Virtual Skills Performance Assessment (VSPA), which required students to simulate a case-based face-to-face patient encounter, or
(2) Virtual Oral Skills Assessment (VOSA), which simulated a case-based scenario with the student providing only verbal responses.
Faculty used checklist rubrics based on Miller’s Pyramid of Assessment to evaluate students’ virtual practical performances. Each assigned faculty tester was identified by number to allow a check of grading consistency. A convenience sample of 623 individual student scores from DPT students across the curriculum (VSPA n = 421, VOSA n = 202) was used. Quantitative data for student virtual practical scores, previous-semester traditional practical scores, assigned faculty tester, and method of practical assessment were analyzed using independent t-tests and one-way ANOVA.
Results: Data analysis showed no difference between student scores for virtual practical assessments when comparing VSPA (M = 94.5, SD = 6.1) to VOSA (M = 96.4, SD = 4.8), p > .001, Cohen’s d = 0.3. When comparing virtual practical assessments (M = 94.0, SD = 8.6) to traditional face-to-face skills assessments (M = 94.1, SD = 2.3), there was no difference in student performance (p = 0.79). Overall, the scores students earned from faculty raters were consistent with those from traditional face-to-face practicals. Faculty rating consistency analysis revealed differences in rating of students’ virtual practical skills in 6 of the 13 courses: one course in the first year and five courses in the second year of the curriculum showed significant differences in faculty rating of student virtual skills performances (p = 0.018, p = 0.001, p = 0.045, p = 0.013, p = 0.004, p = 0.001).
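The effect-size comparison reported above (two independent groups summarized with Cohen’s d) can be sketched as follows; the score arrays are hypothetical stand-ins for illustration only, not the study’s data:

```python
# Minimal sketch of a two-group comparison via Cohen's d (pooled SD),
# as used to compare VSPA and VOSA scores. Data below are hypothetical.
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Hypothetical practical-exam scores for two assessment formats
vosa_scores = [96, 97, 95, 98, 96]
vspa_scores = [94, 95, 93, 96, 92]
d = cohens_d(vosa_scores, vspa_scores)  # positive d: first group scored higher
```

In the study itself, the group comparison would also include an independent t-test (e.g., `scipy.stats.ttest_ind`) to obtain the reported p-values; the sketch shows only the effect-size arithmetic.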
Conclusion(s): Student performance during virtual practical assessments appears consistent irrespective of the method selected. Faculty rating of students’ virtual skills performance was more consistent in the first year of the DPT curriculum, with more variability in the program’s second-year courses. Even with these differences in faculty rating, student scores were consistent with face-to-face scores, indicating that virtual skills practicals may be an acceptable option for DPT programs.
Implications: The Coronavirus 2019 (COVID-19) pandemic has increased the need for innovative virtual methods of testing the technical skills taught in physical therapy programs. Evidence that virtual practical exams are valid would arm educators to support the use of virtual assessments as an alternative to face-to-face testing. This study helps show that effective faculty rating and student performance of technical skills in a virtual format are possible.
Funding, acknowledgements: None
Keywords: COVID-19, Virtual Assessment, Technical Skills
Topic: Education: methods of teaching & learning
Did this work require ethics approval? Yes
Institution: University of St Augustine for Health Sciences
Committee: IRB
Ethics number: UR-0423-358
All authors, affiliations and abstracts have been published as submitted.