Using Learning Analytics to Evaluate the Instructional Design and Student Performance in a Large-Enrollment Scientific Computing Workshop
Description:
Addressing today’s complex scientific challenges will require geoscience curricula to develop skills and competencies, such as scientific computing, in addition to subdisciplinary content knowledge. Online training shows promise for teaching domain-specific scientific computing and for broadening access. However, what constitutes effective course design, pedagogy, and retention strategies needs further exploration. Online courses such as the Seismology Skill Building Workshop, which helps students build scientific computing skills through seismology-focused programming, generate rich datasets of student interactions and performance that can be used to explore the effectiveness of instructional design to a degree not possible in traditional courses. The workshop consists of over 35 interactive assignments with more than 1,000 questions. It was designed to engage students in key skills (e.g., seismology, programming, quantitative literacy), frequently with higher-order thinking, and with increasing challenge as the workshop progresses. To explore the efficacy of the instructional design, we created a table of specifications categorizing each question from the 2022 course according to skill and the revised Bloom's taxonomy. We then compared the distribution of categories to the design intent and examined each category using facility and discrimination indices to evaluate students' performance.
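For readers unfamiliar with the two item-analysis metrics named above, the following minimal Python sketch (an illustration, not the authors' code) shows how they are commonly computed from a binary scores matrix (rows = students, columns = questions; 1 = correct, 0 = incorrect); the 27% grouping fraction and all variable names are assumptions:

    import numpy as np

    def facility_index(scores):
        # Facility (difficulty) index: fraction of students answering
        # each question correctly.
        return scores.mean(axis=0)

    def discrimination_index(scores, group_frac=0.27):
        # Classic upper/lower-group discrimination index: difference in
        # facility between the top and bottom ~27% of students ranked
        # by total score.
        n = scores.shape[0]
        k = max(1, int(round(group_frac * n)))
        order = np.argsort(scores.sum(axis=1))  # low -> high total score
        lower, upper = scores[order[:k]], scores[order[-k:]]
        return upper.mean(axis=0) - lower.mean(axis=0)

    # Example: 5 students x 3 questions
    scores = np.array([[1, 0, 1],
                       [1, 1, 0],
                       [0, 0, 0],
                       [1, 1, 1],
                       [1, 0, 1]])
    print(facility_index(scores))        # [0.8 0.4 0.6]
    print(discrimination_index(scores))  # [1. 1. 1.]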
Results indicate the instructional design of the course did not fully align with the original intentions. For example, 80% of the questions fell within the lowest two levels of Bloom's taxonomy. In addition, rigor did not increase as the workshop progressed. Changes enacted in 2023 increased the number of higher-order thinking questions, progressively reduced scaffolding, reduced the number of no-skill questions, and increased the number of multi-skill questions. To evaluate the impact of these changes on student learning outcomes, a pre/post test targeting each of the categorized skills was administered in 2023. Results show improved performance for all of the key skills, with larger and more consistent gains for higher-order thinking.
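The abstract reports per-skill gains but does not specify the gain metric; as a hypothetical illustration, one common choice is the class-average normalized gain, sketched below in Python (the scores and function name are assumptions, not the authors' data or method):

    import numpy as np

    def normalized_gain(pre, post):
        # Class-average normalized gain <g> = (post% - pre%) / (100 - pre%),
        # computed from percent-correct scores.
        pre_mean, post_mean = pre.mean(), post.mean()
        return (post_mean - pre_mean) / (100.0 - pre_mean)

    # Example: percent-correct scores per student for one skill category
    pre = np.array([40.0, 55.0, 30.0, 60.0])
    post = np.array([70.0, 80.0, 55.0, 85.0])
    print(f"normalized gain: {normalized_gain(pre, post):.2f}")  # 0.49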
Session: Leveraging Cutting-Edge Cyberinfrastructure for Large Scale Data Analysis and Education [Poster Session]
Type: Poster
Date: 5/2/2024
Presentation Time: 08:00 AM (local time)
Presenting Author: Michael Brudzinski
Student Presenter: No
Invited Presentation:
Authors
Gillian Haberli, gillianhaberli@gmail.com, Miami University (now at EarthScope Consortium)
Michael Brudzinski (Presenting Author, Corresponding Author), brudzimr@muohio.edu, Miami University
Michael Hubenthal, michael.hubenthal@earthscope.org, EarthScope Consortium