An articulated approach to the development and evaluation of automated feedback for online MCQ quizzes in Human Biology
Abstract
This paper describes an articulated programme of development and evaluation of automatically presented explanatory feedback comments for online, enriched multiple-choice-style quizzes in Human Biology for first-year university courses. The degree of articulation of the separate components of the programme arose almost unintentionally, from the inclusion of common sets of demographic questions in several components of the work and from continuity of logon identities, but it proved to be a powerful means of understanding the dynamics of student engagement with the online learning process and the effectiveness of the product we were testing. In particular, links were established between expectations of academic performance and the amount of paid employment in which students were engaged, and between expected and achieved levels of performance. Students who expected lower levels of performance at the outset were also less convinced of the potential of feedback to help them with their studies. Analysis of the patterns of use of the online test revealed that the current accessibility arrangements for online summative assessments seriously disadvantage working students, and that the standard duration of the summative tests was approximately three times the preferred online work span of the younger students. ‘Dose’- and ‘decay’-graded selective improvements in end-of-semester assessments in the topics covered by the feedback comments could be demonstrated.
Published: 2012-10-08
Section: Refereed Papers