Feedback Analytics - Measuring the impact of video feedback on student learning

Jack Tsang-Hsing Wang, Rachel Chen, Peter Worthy


Watching videos has become an integral part of Higher Education, with students routinely interacting with YouTube, lecture recordings, and Massive Open Online Courses (MOOCs) as part of their blended-learning experiences. Short videos that can be easily accessed and repeatedly viewed have also been crucial to the success of educational initiatives such as the Khan Academy, and this project adopted online videos as a key mechanism of feedback delivery. Instructor video feedback has the potential to provide multisensory information on student performance through a personalized communication channel that students readily engage with.

This project developed and deployed vMarks, a software application designed to deliver instructor feedback as short personalized videos that students can access online. We aimed to investigate the impact of video feedback on student learning, specifically by reviewing student interaction with the videos through learning analytics (feedback analytics), and by examining student perceptions of effective feedback practices in Higher Education.

Design and methods
Following the completion of a practical laboratory examination, students enrolled in the 2014 and 2015 offerings of MICR2000 (‘Microbiology and Immunology’) at The University of Queensland (UQ) were given video feedback for this assessment task. Instructors recorded a short 1-minute video highlighting each student’s experimental outcomes and laboratory performance, which was emailed to students directly through vMarks. Student interaction with vMarks was measured through feedback analytics (frequency of logins, video views, pausing and scrubbing patterns), and their perceptions towards feedback in higher education collated through surveys (n=60) and one-on-one interviews (n=16). Informed consent was obtained in all cases (Project #2012000755 approved by the UQ Behavioural Social Science Ethical Review Committee).

Over 50% of the 2014 and 2015 MICR2000 students who logged into vMarks watched their feedback video once without pausing or scrubbing; the remaining students viewed the video multiple times and paused/scrubbed at least once during the 1-minute video. Video timestamps with high pausing/scrubbing frequency corresponded to specific experimental skills assessed in the practical examination. Pre- and post-survey analysis revealed a 17% increase in student responses reporting video feedback as the most helpful form of feedback, which was further supported by interviews in which 27% of students found video feedback to be most useful for their learning.
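As a minimal sketch of how pausing/scrubbing hotspots like these could be located (not the vMarks implementation, which is not described here), per-timestamp event frequencies might be binned and ranked as follows, assuming pause/scrub timestamps are available in seconds:

```python
from collections import Counter

def scrub_hotspots(events, bin_size=5, top_n=3):
    """Bin pause/scrub timestamps (seconds) into bin_size-second segments
    and return the top_n most frequently revisited segments as
    (segment_start, event_count) pairs."""
    bins = Counter((t // bin_size) * bin_size for t in events)
    return bins.most_common(top_n)

# Hypothetical timestamps pooled across students for one feedback video
events = [12, 13, 14, 14, 15, 32, 33, 33, 47, 48, 48, 49, 49, 50]
print(scrub_hotspots(events))  # → [(45, 5), (10, 4), (30, 3)]
```

Segments with high counts could then be mapped back to the experimental skill being discussed at that point in the video.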

Video analytics can be used to highlight common student misconceptions and facilitate the refinement of the feedback/feed-forward loop across multiple iterations of assessment. The accessibility of video feedback presents a powerful tool for instructors to expand their feedback practices.


Video feedback; feedback analytics; learning analytics
