Disrupting the past paper pandemic – developing new question banks


  • Diana Warren School of Mathematics and Statistics, University of Sydney, Sydney, NSW, 2006, Australia


academic integrity, past papers, multiple-choice question banks, differentiation


One way of maintaining the academic standard and integrity of successive deliveries of a unit is to re-use a confidential multiple-choice question bank in the final exam. Such questions can be carefully workshopped to ensure that a breadth of learning outcomes is assessed, without repetition, at different performance levels. The COVID-19 pandemic forced higher education institutions to pivot quickly to online exams, causing a pandemic of past papers being released on multiple forums, including student discussion boards and contract cheating companies. These releases included previously well-tested, confidential question banks. As a result, academics needed to rapidly produce new collections of questions, ideally different in form from those now available online.

Our study focuses on exam data from DATA1001 (Foundations of Data Science) at The University of Sydney, with an annual cohort of more than 2000 students. We investigate patterns in the new multiple-choice question bank: What was the actual performance level compared to the expected performance level? What types of questions best differentiated between students of different abilities? What emerges are interesting findings about what students mastered and what they struggled to learn, with implications for developing and reviewing multiple-choice questions.
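The question of how well an item differentiates between students of different abilities is commonly addressed with classical item analysis: item difficulty (the proportion answering correctly) and a discrimination index such as the corrected item-total point-biserial correlation. The abstract does not specify the study's method, so the sketch below is only an illustration of that standard approach; the `responses` matrix and all figures in it are hypothetical, not DATA1001 data.

```python
# Illustrative sketch (assumed method, not the study's): classical
# item difficulty and discrimination for a 0/1 response matrix,
# rows = students, columns = items.
from math import sqrt

def item_statistics(responses):
    """Return a list of (difficulty, discrimination) pairs, one per item.

    difficulty: proportion of students answering the item correctly.
    discrimination: point-biserial correlation between the item score and
    the total score on the remaining items (corrected item-total).
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    stats = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        # Exclude the item itself from the total to avoid inflating r.
        rest = [totals[i] - item[i] for i in range(n_students)]
        p = sum(item) / n_students  # difficulty
        mean_rest = sum(rest) / n_students
        cov = sum((item[i] - p) * (rest[i] - mean_rest)
                  for i in range(n_students)) / n_students
        var_item = p * (1 - p)
        var_rest = sum((r - mean_rest) ** 2 for r in rest) / n_students
        r_pb = cov / sqrt(var_item * var_rest) if var_item and var_rest else 0.0
        stats.append((p, r_pb))
    return stats

# Toy data: 4 students x 3 items (hypothetical).
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
]
for j, (p, r) in enumerate(item_statistics(responses)):
    print(f"item {j}: difficulty={p:.2f} discrimination={r:.2f}")
```

Comparing each item's observed difficulty against its intended performance level, and flagging items with low discrimination, is one way such a question bank could be reviewed after delivery.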