BioLogic: new software to improve explanatory skills and facilitate feedback and assessment in large STEM classes
Keywords: explanations, assessment, technology enhanced learning, student experience
Abstract
The ability to explain scientific concepts in clear, logical language is a crucial attribute for STEM students (Alameh et al., 2022; Braaten & Windschitl, 2011). Yet software tools that develop these skills remain limited. Written explanations to Short-Answer Questions are valuable for assessing reasoning and understanding, but they often disadvantage non-native English speakers and impose a heavy marking burden. Multiple-Choice Questions, though easily auto-marked, also carry language bias, primarily test comprehension rather than the ability to construct explanations, and are susceptible to the negative suggestion effect (Roediger & Marsh, 2005).
Here we describe BioLogic, a novel software tool designed to address these limitations. BioLogic supports students in constructing grammatically correct and scientifically precise explanations by combining modular ‘building-block’ statements (independent clauses) using logical connectors (conjunctions). By selecting from keyword options, students can modify the meaning of sentences and customise the wording of connectors, enabling precision and clarity while reducing language-related barriers. This structured format also facilitates automated marking.
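The abstract does not detail BioLogic's internal data model, so the following Python sketch is purely illustrative: it shows one way that building-block clauses with keyword slots and selectable connector wordings could be represented and assembled into a complete sentence. All class names, fields, and the biology example are assumptions introduced here, not part of BioLogic.

from dataclasses import dataclass, field

# Hypothetical data model; an illustration only, not BioLogic's implementation.

@dataclass
class BuildingBlock:
    """An independent clause with keyword slots, e.g. 'voltage-gated {ion} channels open'."""
    template: str
    keyword_options: dict = field(default_factory=dict)  # slot name -> allowed key words

    def render(self, choices: dict) -> str:
        # Fill each slot with the student's selected key word, rejecting words not on the list.
        for slot, word in choices.items():
            if word not in self.keyword_options.get(slot, []):
                raise ValueError(f"'{word}' is not an allowed key word for slot '{slot}'")
        return self.template.format(**choices)

@dataclass
class Connector:
    """A logical connector (conjunction) with alternative wordings the student can choose from."""
    wordings: list

def assemble(blocks, connectors, block_choices, connector_choices) -> str:
    """Join the rendered clauses with the chosen connector wordings into one sentence."""
    parts = [blocks[0].render(block_choices[0])]
    for block, conn, choices, idx in zip(blocks[1:], connectors, block_choices[1:], connector_choices):
        parts.append(conn.wordings[idx])
        parts.append(block.render(choices))
    sentence = " ".join(parts)
    return sentence[0].upper() + sentence[1:] + "."

# Toy usage with invented content
channels = BuildingBlock("voltage-gated {ion} channels open", {"ion": ["Na+", "K+", "Ca2+"]})
membrane = BuildingBlock("the membrane {change}", {"change": ["depolarises", "hyperpolarises"]})
because = Connector(["because", "since", "as a consequence of the fact that"])

print(assemble([channels, membrane], [because],
               [{"ion": "Na+"}, {"change": "depolarises"}], [0]))
# -> Voltage-gated Na+ channels open because the membrane depolarises.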
To evaluate the user experience of BioLogic, we deployed it in a second-year biomedical workshop and invited students to complete an anonymous, mixed-methods survey at the end of their session. The survey comprised 20 Likert-scale items across three domains (self-assessed competency, interface design, and comparison with written exam responses), followed by open-ended questions on likes, dislikes, perceived educational value in workshops, and feasibility for use in assessment. Results were mixed. While 82% of students (n=50) felt BioLogic would help them learn concepts during workshops, 66% were opposed to its use in exams. Nevertheless, most students believed answers created in BioLogic would be more precise (84%) and comprehensive (61%) than their own written responses, and that the tool allowed sufficient freedom of expression (59%). In contrast, most felt it would be slower (66%), more difficult (61%), and produce lower-quality English (66%). Strikingly, the students most supportive of its use in exams self-identified as having weaker exam technique, English proficiency, and logical reasoning skills. Thus, BioLogic may offer particular benefits to lower-performing students.

We are currently analysing subsequent summative assessments to evaluate BioLogic's impact on student performance and explanatory skills, and to explore how these effects vary across different levels of academic achievement. Ongoing development is focussed on building a Grading Engine that will provide real-time feedback and enable automated grading.
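The abstract does not specify how the planned Grading Engine will score responses; the sketch below is only meant to illustrate why a structured answer format simplifies automated marking. It assumes, hypothetically, that each answer can be reduced to a set of (clause, key word, connector) tuples and scored by set comparison against a model answer; the function, identifiers, and scoring scheme are all assumptions, not BioLogic's design.

# Hypothetical marking sketch: because the answer is structured data rather than free text,
# grading reduces to comparing tuples instead of parsing natural-language English.

def grade(answer, model_answer):
    """Return a score in [0, 1] and feedback, assuming both arguments are sets of
    (clause_id, key_word, connector) tuples."""
    required = set(model_answer)
    given = set(answer)
    correct = required & given
    missing = required - given
    extra = given - required
    score = len(correct) / len(required) if required else 0.0
    feedback = []
    if missing:
        feedback.append(f"Missing required statement(s): {[c for c, _, _ in missing]}")
    if extra:
        feedback.append(f"Statement(s) not in the model answer: {[c for c, _, _ in extra]}")
    return score, feedback

# Toy usage with invented identifiers
model = {("na_channels_open", "Na+", "because"), ("membrane_depolarises", "depolarises", None)}
student = {("na_channels_open", "Na+", "because"), ("k_channels_open", "K+", "therefore")}
score, notes = grade(student, model)
print(f"score = {score:.2f}")   # -> score = 0.50
for note in notes:
    print(note)

A real scheme would likely also weight statements and check the logical ordering of clauses, but the key point illustrated here is that the structured format removes the need to interpret free-text responses.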
REFERENCES
Alameh, S., Abd-El-Khalick, F., & Brown, D. (2022). The nature of scientific explanation: Examining the perceptions of the nature, quality, and “goodness” of explanation among college students, science teachers, and scientists. Journal of Research in Science Teaching, 60(1), 100-135.
Braaten, M., & Windschitl, M. (2011). Working toward a stronger conceptualization of scientific explanation for science education. Science Education, 95(4), 639-669.
Roediger, H. L., III, & Marsh, E. J. (2005). The positive and negative consequences of multiple-choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(5), 1155-1159.