A Medical Science Educator Article Review From Dr. Rachel Porter

This month, the IAMSE Publications Committee review is taken from the article titled Rapid Feedback: Assessing Pre-clinical Teaching in the Era of Online Learning (15 June 2022) by Daniel Walden, Meagan Rawls, Sally A. Santen, Moshe Feldman, Anna Vinnikova & Alan Dow.

Having recently embraced novel virtual learning methods, the authors sought to enhance the evaluation of their implementation through rapid student feedback. Recognizing the limitations of summative evaluations, they augmented their required end-of-course feedback model with a less formal, optional survey after each learning session. Within 48 hours of each session, students responded to three brief survey items deployed via a web link, QR code, and the learning platform. The survey collected a Likert-scale rating along with free-text comments, which were then collated and emailed to faculty. The investigators then studied both instructor and student perceptions of the rapid feedback process.

The approach was implemented for 49 of the 50 sessions within a second-year MD program course. The investigators analyzed the number of forms submitted by students, their submission method (QR code or LMS link) and timing, and the ratings for passive versus active learning sessions. They purposely did not analyze the students' qualitative comments, focusing instead on faculty perceptions after instructors had received the quantitative ratings and qualitative feedback in the rapid format.

Though the student response rate was fairly low (18%), reactions to the process from both students and faculty were generally positive. Of participating students, 91% reported feeling more involved in the feedback process, 70% noted that changes were made in response to their feedback, and only 28% felt that the frequent requests for feedback were burdensome. Faculty participation was higher, with 68% of course instructors responding to the survey. Of those, 42% felt the rapid feedback was more helpful and 50% agreed it was more specific than end-of-course feedback. Two-thirds (67%) reported being able to implement feedback in ways that improved subsequent sessions, and 69% indicated they would like to see rapid feedback continued in the future.

Though these responses suggest positive faculty reception, the authors did note that some instructors who taught more sessions within the course described the mere possibility of reading negative student comments as “terrifying.” In their discussion, the authors recommended training students to give effective feedback in order to curtail overly specific, hurtful, or unprofessional comments. Other noted drawbacks included the low student response rate and the challenge of creating learning environments conducive to effective feedback.

This article was a stimulating and timely read, given how closely rapid feedback is tied to the implementation of novel online teaching methods. The need for effective and actionable feedback on teaching and learning is common across educational contexts and can be particularly challenging to meet in the health professions. While summative end-of-course surveys often provide the information needed for institutional and accreditation requirements, they can lack specificity, and their timing precludes immediate adjustments in response. Rapid feedback is a promising strategy, and this study illustrates both its promise and its inherent drawbacks. The authors provide thoughtful discussion and practical suggestions for those of us tackling similar challenges.

Rachel Porter, PhD
Senior Education Strategist
Duke University Physician Assistant Program