Data-driven decision making in chemistry first-year subjects

Authors

  • Simon B Bedford, University of Wollongong
  • Jennifer Heath

Abstract

Background: Analytics is not a new area of endeavour; many industries and other professions are well ahead of the education sector in the uptake of advanced analytics methods and tools (Abdous, He, & Yen, 2012; Dziuban, Moskal, Cavanagh, & Watts, 2012). Wagner and Ice (2012) describe higher education as being on the early side of the analytics adoption curve when compared with retail, telecommunications, financial services and manufacturing. In higher education institutions, analytics is often used to identify, and to predict, individual students who may be 'at risk' (Fritz, 2011).

Aims: The primary aim was to deploy information technologies that provide learning analytics data on students enrolled in large first-year chemistry subjects. These data give academics, part-time teaching staff and professional staff valuable, real-time information on students' learning progression, experience, engagement and motivation, so that suitable interventions can be made for students at risk of failing the subject.

Design and methods: Learning analytics (LA) is the measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimising learning and the environments in which it occurs. No new data have been captured to get learning analytics started at UOW; existing information is being utilised from point-of-service information systems (PASS, Library and student management systems) and the Moodle grade book on the subject sites. As students make use of the subject Moodle sites, information is automatically gathered about learning resource use, time on task, assessment item activities and student involvement in online forums. Each student leaves 'electronic breadcrumbs' within these systems as they go about their student journey, and these are consolidated in the learning analytics data warehouse. Learning analytics then draws data from these diverse systems to provide actionable, visualised intelligence on which staff can base decisions.

Results: The learning analytics have been deployed in two first-year subjects, which have a combined cohort of some 700 students and contain some 50 activities, assessments and resources to monitor. The study is approximately at the halfway point, having so far covered 13 weeks of teaching, with five visualisation reports created. The full study will have been completed by the time of the conference presentation, although not all data will have been analysed by that stage. Key findings so far are:

  • Bringing together information from multiple data sources to provide a more holistic picture of student engagement and activity within a subject (a consolidation sketched in the example below) is useful in broad terms, but caution is required when interpreting the data, to avoid making assumptions and drawing false conclusions. A mutual understanding between learning analytics staff and academic staff is required in the decision-making process.
  • Analytical insights can inform more tailored and focused student interventions that bring about positive change in students' resource utilisation and performance on assessment tasks. For example, they can reveal the value added by having, or not having, peer assisted study sessions (PASS) within a subject, and help develop a culture that uses data in making instructional curriculum design changes.
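
As an illustration of the consolidation described under Design and methods, the sketch below merges hypothetical Moodle, Library and PASS activity exports into a single per-student weekly engagement view and flags low-engagement students. It is a minimal sketch only: the file names, column names, score weightings and threshold are assumptions for illustration, not the actual UOW systems, warehouse schema or intervention rules.

    # Minimal sketch: consolidating 'electronic breadcrumbs' from several
    # point-of-service systems into one per-student engagement view.
    # All file names, column names and the threshold are hypothetical.
    import pandas as pd

    # Hypothetical weekly exports from each system.
    moodle = pd.read_csv("moodle_log.csv")         # student_id, week, clicks
    library = pd.read_csv("library_access.csv")    # student_id, week, visits
    pass_att = pd.read_csv("pass_attendance.csv")  # student_id, week, attended

    # Aggregate each source to one row per student per week.
    keys = ["student_id", "week"]
    moodle_w = moodle.groupby(keys, as_index=False)["clicks"].sum()
    library_w = library.groupby(keys, as_index=False)["visits"].sum()
    pass_w = pass_att.groupby(keys, as_index=False)["attended"].sum()

    # Outer joins keep students who appear in only some systems;
    # missing activity is treated as zero.
    engagement = (
        moodle_w.merge(library_w, on=keys, how="outer")
                .merge(pass_w, on=keys, how="outer")
                .fillna(0)
    )

    # A crude composite score; real weightings would need validating
    # against assessment outcomes.
    engagement["score"] = (
        engagement["clicks"]
        + 2 * engagement["visits"]
        + 5 * engagement["attended"]
    )

    # Students below an assumed weekly threshold become candidates
    # for early intervention.
    AT_RISK_THRESHOLD = 10
    at_risk = engagement[engagement["score"] < AT_RISK_THRESHOLD]
    print(at_risk.sort_values(["week", "score"]))

In practice such a consolidated table would sit in the learning analytics data warehouse and feed the visualisation reports, rather than being printed directly.
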
Conclusions: The study so far has shown that learning analytics has been able to identify, early in the semester, a group of students at risk of failing, and that interventions have been successful in preventing failure, but that noise in the data is an issue that can obscure other students whose performance drops off towards the end of the semester.
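
On the noise problem noted in the Conclusions, one simple mitigation is to smooth each student's weekly engagement before looking for drop-offs. The sketch below assumes the hypothetical engagement table from the previous example and uses a rolling mean; the window size, cut-over week and drop criterion are illustrative assumptions, not the study's actual method.

    # Minimal sketch: damping week-to-week noise with a rolling mean,
    # then flagging students whose late-semester engagement falls well
    # below their own early-semester baseline. Parameters are assumed.
    import pandas as pd

    def late_dropoffs(engagement: pd.DataFrame, window: int = 3,
                      drop_fraction: float = 0.5,
                      from_week: int = 8) -> pd.Series:
        """Students whose smoothed late-semester score is below
        drop_fraction of their early-semester average."""
        eng = engagement.sort_values(["student_id", "week"]).copy()
        # Per-student rolling mean damps one-off quiet weeks (noise).
        eng["smooth"] = (
            eng.groupby("student_id")["score"]
               .transform(lambda s: s.rolling(window, min_periods=1).mean())
        )
        early = eng[eng["week"] < from_week].groupby("student_id")["smooth"].mean()
        late = eng[eng["week"] >= from_week].groupby("student_id")["smooth"].mean()
        ratio = (late / early).dropna()
        return ratio[ratio < drop_fraction]

    # Example: late_dropoffs(engagement) lists students whose engagement
    # halves after week 8, despite noisy individual weeks.
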

Author Biography

  • Simon B Bedford, University of Wollongong
    Senior Lecturer

Published

2015-08-29