Comparing student cohorts between years in first-year chemistry assessments

Authors

  • Jacob Rhys Marchant, The University of Adelaide
  • Simon Pyke, The University of Adelaide
  • Natalie Williamson, The University of Adelaide

Keywords:

assessment, student ability, student cohort

Abstract

BACKGROUND It is not uncommon to hear the sentiment that student cohorts are less capable than the cohorts that came before them. However, the evidence for this sentiment is lacking, as comparing student cohorts from different years is challenging due to changes in how the different cohorts were assessed and changes to the overall structure and delivery of the courses across years. The University of Adelaide holds a large bank of multiple-choice question (MCQ) assessment results from its four first-year chemistry courses that can be used to address this issue, as most of the MCQ items used within the assessments have not changed over time.

AIMS By using the results of MCQ assessments over multiple years, this research aims to determine whether there has been a significant change in the average ability of student cohorts from different years.

DESIGN AND METHODS Using a combination of item stacking and Rasch analysis, it is possible to compare the difficulty of the MCQ items and the ability of the students over multiple years by placing them on the same relative scale. The item difficulties can then be compared to ensure that the items perform the same way in each year they are used. If the item difficulties are not significantly different, then any changes in assessment results must be due to changes in the ability of the student cohort(s), and the ability of the students can then be compared through statistical analysis to determine whether there are any significant changes over multiple years.

RESULTS By comparing student results over a period of four years (2012–2015) in four different first-year chemistry courses, it was observed that none of the MCQ items used over multiple years showed significant differences in difficulty, so any changes in assessment results must be attributable to changes in the ability of the student cohort(s). In two of the four courses, the 2012 cohort showed significant differences from the other yearly cohorts. However, one of these cohorts showed significantly higher student ability, while the other showed significantly lower student ability. No other instances of a significant ability difference between cohorts were observed in this analysis.

CONCLUSIONS Only two of the sixteen cohorts analysed showed a significant difference between their average ability and the average ability of the cohorts from other years participating in the same course. In addition, the two cohorts that did show significant differences (both from 2012) did not follow the same trend as each other, meaning it is unreasonable to infer any potential trends from these results alone. At this point, the results of this analysis suggest that the ability of the students has not significantly changed between years. Further work using MCQ assessment results from additional cohorts will be carried out to investigate whether the 2012 cohort results are in fact anomalous.
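For readers unfamiliar with the approach described under DESIGN AND METHODS, the analysis rests on the dichotomous Rasch model, in which the probability that a student answers an item correctly depends only on the difference between the student's ability and the item's difficulty. The following is a minimal statement of the standard model, included here for illustration only; the symbols θ_n (ability of student n) and δ_i (difficulty of item i) are conventional Rasch notation and are not defined in the abstract itself:

P(X_{ni} = 1) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}

Stacking the response matrices from successive years before estimation places all student abilities θ_n and item difficulties δ_i on a single logit scale, which is what allows item difficulties, and hence cohort abilities, to be compared across years.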

Published

2019-09-26