Learning practices through recursive questionnaires

Authors

  • Charlie V. Sarmiento Physics Department, Federal University of Rio de Janeiro, Rio de Janeiro, Rio de Janeiro 21941-909, Brazil https://orcid.org/0000-0002-5312-0503
  • Germano M. Penello Physics Department, Federal University of Rio de Janeiro, Rio de Janeiro, Rio de Janeiro 21941-909, Brazil
  • Lucas Sigaud Physics Department, Federal University of Fluminense, Niteroi, Rio de Janeiro 24210-346, Brazil

Keywords:

Online Learning, Teaching and Learning Strategies, Assessment Methodologies

Abstract

Distance education or distance learning (DL) in undergraduate courses has grown considerably in recent decades (Holmes & Reid, 2017) and received even more attention due to the COVID-19 pandemic (Crawford et al., 2020; Baggaley, 2020). Students pursuing distance learning have often suffered from isolation and the lack of a learning community, resulting in high dropout rates. In general, students’ engagement depends strongly on the evaluation methods (Holmes, 2018), which largely define how students manage their study time.

This work was developed in Introduction to Physical Sciences 1 (ICF1), a DL course with over 1500 students per semester at CEDERJ (Distance Education Center of the State of Rio de Janeiro). We modified the evaluation method of two optional assessments (OA) given to the students, whose main goal is to promote practice, through exercises, of two fundamental topics in physics about which students often carry doubts and misconceptions from high school: vectors (OA1) and Newton's laws (OA2).

This modification was carried out by implementing a recursive formative online assessment that students can answer as many times as they want until a specified due date. The assessment discourages memorization of answers by generating new numerical values for each question and randomly shuffling the question order in every new attempt. After finishing the questionnaire, students receive immediate feedback so they can correct their mistakes and try to achieve a better score; in this way, mistakes become a beneficial part of the learning process. Each attempt is recorded, and all students’ progress is analyzed with a Python script. With this approach, the professor can identify, and eventually interact with, students who need a higher level of attention before the final exams.
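The core mechanism described above, regenerating numerical values and shuffling question order on every attempt, can be sketched as follows. This is a minimal illustration, not the authors' actual implementation; the question on vector addition and all function names are hypothetical.

```python
import random

def make_vector_question(rng):
    """Hypothetical OA1-style question: fresh numerical values each attempt."""
    ax, ay = rng.randint(1, 9), rng.randint(1, 9)
    bx, by = rng.randint(1, 9), rng.randint(1, 9)
    prompt = f"Given A = ({ax}, {ay}) and B = ({bx}, {by}), compute A + B."
    answer = (ax + bx, ay + by)
    return prompt, answer

def build_attempt(question_makers, seed):
    """Build one questionnaire attempt: regenerate values, then shuffle order."""
    rng = random.Random(seed)  # a per-attempt seed yields a distinct attempt
    questions = [make(rng) for make in question_makers]
    rng.shuffle(questions)  # question order differs in every new attempt
    return questions
```

Because both the values and the ordering change per attempt, two attempts rarely coincide, so copying a previous answer key gives no advantage.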

The former OAs neither evaluated the students reliably nor helped them better understand the required subjects. In the old format, students could compare and copy answers to the questions, which gave them the illusory impression that they understood the underlying concepts. As a result, the scores in the OAs did not correlate with the students’ final scores. Our analysis after implementing the recursive questionnaires shows that students with high scores in the OA are more likely to pass the course, as expected from an assessment that correctly evaluates students’ performance. This methodology is a valuable asset for the students’ learning process, whether they are in a DL course or a regular face-to-face course, and for professors, who can identify patterns that foresee students’ difficulties and address specific students in need, even in classes with a large number of students.
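The analysis of recorded attempts can be sketched as a simple comparison of pass rates between students above and below an OA score threshold. The records, the 5.0 cutoff, and the function name below are all hypothetical placeholders, not the paper's data or code.

```python
# Hypothetical records: (student_id, best OA score on a 0-10 scale, passed final)
records = [
    ("s1", 9.0, True), ("s2", 8.5, True), ("s3", 3.0, False),
    ("s4", 7.0, True), ("s5", 2.5, False), ("s6", 6.0, False),
]

def pass_rate_by_band(records, threshold=5.0):
    """Compare pass rates of students at or above vs. below an OA threshold."""
    high = [passed for _, score, passed in records if score >= threshold]
    low = [passed for _, score, passed in records if score < threshold]
    rate = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return rate(high), rate(low)

high_rate, low_rate = pass_rate_by_band(records)
```

A gap between the two rates is the kind of correlation the abstract reports; on real data, the same per-attempt records also let the professor flag low-scoring students before the final exams.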

REFERENCES

Baggaley, J. (2020). Educational distancing. Distance Education, 41(4), 582-588. https://doi.org/10.1080/01587919.2020.1821609

Crawford, J., Butler-Henderson, K., Rudolph, J., Malkawi, B., Glowatz, M., Burton, R., Magni, P., & Lam, S. (2020). COVID-19: 20 countries' higher education intra-period digital pedagogy responses. Journal of Applied Learning & Teaching, 3(1), 1-20. http://dx.doi.org/10.37074/jalt.2020.3.1.7

Holmes, C. M., & Reid, C. (2017). A comparison study of on-campus and online learning outcomes for a research methods course. The Journal of Counselor Preparation and Supervision, 9(2), 15. http://dx.doi.org/10.7729/92.1182

Holmes, N. (2018). Engaging with assessment: Increasing student engagement through continuous assessment. Active Learning in Higher Education, 19(1), 23-34. https://doi.org/10.1177/1469787417723230

Published

2022-11-25