“I try my very best and then I send it to the wizards, who make up numbers”

Science students’ perceptions of (in)effective assessment and feedback practices

Authors

  • Christine Chinchen School of Chemistry, University of New South Wales, Sydney, NSW 2052, Australia
  • Kate Jackson School of Physics, University of New South Wales, Sydney, NSW 2052, Australia
  • Jennifer Stansby School of Chemistry, University of New South Wales, Sydney, NSW 2052, Australia
  • Nirmani Wijenayake School of Biotechnology and Biomolecular Sciences, University of New South Wales, Sydney, NSW 2052, Australia
  • Siobhán S. Wills School of Chemistry, University of New South Wales, Sydney, NSW 2052, Australia

Keywords:

assessment design, feedback practices, student perceptions

Abstract

Assessment and feedback are key concerns for tertiary students, as evidenced by university and national student experience surveys (QILT, 2020). While these large surveys convey general student sentiment, the literature recommends approaches beyond surveys to deepen understanding of students' experiences in individual faculties and courses (Berk, 2018). This is particularly important when planning any changes to assessment practices.

Following on from an initial study into students’ assessment and feedback literacy (Wills et al., 2022), we present the second stage of our project aiming to understand students’ experiences and perceptions of assessment and feedback at the University of New South Wales.

From a thematic analysis of semi-structured student interviews, we present several case studies of what science students consider to be effective assessment and feedback in their program.

Some identified themes, such as linked assessments, worked answers, and annotated submissions, were found to be effective practices across the board. However, on other themes, such as the usefulness of formative assessment, rubrics, and positive feedback, students were not in agreement. Resoundingly, students condemned the lack of closure around final exams.

These and other findings will be presented, followed by a discussion of students’ suggestions for improvement and a look ahead to future assessment co-design with students.

Feedback on final exams:

“…about final exams, it it's like a black box. You know, you answer and you might get, I don't know, 70%. But that means there's 30% you've got wrong and you still want to know why that is…”

Effectiveness of formative assessment:

“I think that often, they're just one or two questions that are about a detail that was unimportant. And the lecture isn't… the lecture content isn’t tested properly.”

REFERENCES

Berk, R. (2018). Beyond student ratings: Fourteen other sources of evidence to evaluate teaching. In R. Ellis & E. Hoyles (Eds.), Handbook of quality assurance for university teaching (pp. 317–344). London: Routledge.

QILT. (2020). Student Experience Survey. Social Research Centre. https://www.qilt.edu.au/surveys/student-experience-survey-(ses)#report

Wills, S. S., Jackson, K., & Wijenayake, N. (2022). On the same page: Science students' assessment literacy. In D. Spagnoli & A. Yeung (Eds.), Proceedings of The Australian Conference on Science and Mathematics Education (p. 76). Perth, Western Australia.

Published

2023-08-24