EDLI’s evaluation of Packback provides insight for future software evaluation processes

  • Overview: We conducted a pilot evaluation of Packback Questions, a D2L-integrated discussion forum that utilizes AI to provide students with immediate feedback on their writing, encouraging them to develop skills for critical analysis and inquiry.
  • Outcomes:
    • Packback seems to encourage instructors to utilize discussion features, particularly in large courses
    • We did not find improvements in student engagement (D2L log-ins) or grades in Packback courses
    • Our evaluation process revealed the need for quasi-experimental design in future evaluations

Packback Questions is an AI-enhanced, D2L-integrated discussion forum focused on increasing student engagement by encouraging learners to ask critical questions and give meaningful feedback. The EDLI team conducted an evaluation using data from Fall 2022 to assess Packback’s impact on two metrics: D2L log-ins, as a measure of course engagement, and students’ final grades. The evaluation was time-sensitive: we started in the middle of the Spring 2023 semester and aimed to have results by the end of that semester. We were therefore limited to past data that had already been collected.

Comparison Courses

We compared courses that used Packback in Fall 2022 to a similar set of courses that did not. Comparison courses were of similar size and the same modality. Ideally, we sought the identical course code taught by a different instructor, or the same course and instructor in a prior semester. This close comparison was rare, however; most comparison courses were different courses in the same department. This introduced additional noise into our data, as we were unable to control for several important factors in each course.
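To illustrate the matching criteria, the sketch below shows one way comparison courses could be selected on enrollment size and modality. The data frame columns (course_code, enrollment, modality, used_packback), the example values, and the 20% enrollment tolerance are illustrative assumptions, not the exact procedure or data EDLI used.

```python
import pandas as pd

# Illustrative course roster; column names and values are assumptions, not EDLI's data.
courses = pd.DataFrame({
    "course_code":   ["BIO101", "BIO110", "PSY200", "PSY201", "CHEM101"],
    "enrollment":    [120, 115, 45, 50, 200],
    "modality":      ["in-person", "in-person", "online", "online", "in-person"],
    "used_packback": [True, False, True, False, True],
})

def find_comparison(course, pool, size_tolerance=0.20):
    """Return non-Packback courses with the same modality and similar enrollment."""
    lo = course.enrollment * (1 - size_tolerance)
    hi = course.enrollment * (1 + size_tolerance)
    return pool[
        (~pool.used_packback)
        & (pool.modality == course.modality)
        & pool.enrollment.between(lo, hi)
    ]

# List candidate comparison courses for each Packback course.
for course in courses[courses.used_packback].itertuples():
    matches = find_comparison(course, courses)
    print(course.course_code, "->", list(matches.course_code))
```

In practice, courses without a close size and modality match would fall back to a different course in the same department, as described above.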

Packback’s Impact on Discussion Boards, Student Grades, and Engagement

We first explored whether comparison courses had a D2L Discussion board in their course set-up. We found that most comparison courses that had a D2L Discussion board did not actively use the feature. This lack of discussion posts is itself an interesting data point when comparing Packback and comparison courses. More details about the presence of discussion posts can be viewed in the Sankey diagram below: 

The diagram depicts the flow of classes based on their D2L information and discussion activity. Starting from the left:

  • 36 total classes: 27 have D2L info and 9 have no D2L info.
  • Of the 27 classes with D2L info: 13 have discussion posts present and 14 have no discussion posts.
  • Of the 13 classes with discussion posts present: 3 have active threads, 4 are marked "Unknown," and 6 have no active posts.

The flows are color-coded, and the width of each flow represents the proportion of classes in each category. The diagram was created with SankeyMATIC.
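For anyone who wants to rebuild the figure programmatically rather than in SankeyMATIC, the counts above can be rendered with Plotly's Sankey trace. This is a minimal sketch using Plotly as a stand-in for the original SankeyMATIC tool; the node labels and flow values are taken directly from the breakdown above.

```python
import plotly.graph_objects as go

# Node labels for each stage of the flow.
labels = [
    "Total classes (36)", "Have D2L Info (27)", "No D2L Info (9)",
    "Discussion posts present (13)", "No discussion posts (14)",
    "Active threads (3)", "Unknown (4)", "No active posts (6)",
]

# Each (source, target, value) triple mirrors one flow in the diagram.
sources = [0, 0, 1, 1, 3, 3, 3]
targets = [1, 2, 3, 4, 5, 6, 7]
values  = [27, 9, 13, 14, 3, 4, 6]

fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=20, thickness=15),
    link=dict(source=sources, target=targets, value=values),
))
fig.update_layout(title_text="Classes by D2L information and discussion activity")
fig.show()
```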

Packback does seem to influence whether instructors use online discussion boards, particularly in large courses. Among comparison courses with more than 60 students, none used D2L discussion boards; by contrast, fourteen Packback courses had more than 60 students, nine of which had more than 100.

Overall, there were no differences between Packback and comparison courses in student grades or in engagement as measured by D2L log-ins. In large courses, however, the distribution of students’ grades was significantly lower in courses that used Packback than in those that did not. We can’t draw causal conclusions from this finding, especially given the inexact matching between our Packback and comparison courses.
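As an illustration of the kind of distributional comparison behind this finding, the sketch below applies a Mann-Whitney U test to two hypothetical grade samples. The grade values, variable names, and choice of test are assumptions for illustration only, not EDLI's actual data or analysis pipeline.

```python
from scipy import stats

# Illustrative final-grade samples (0-4.0 scale) for large courses;
# these values are placeholders, not EDLI's data.
packback_grades   = [3.1, 2.8, 3.4, 2.5, 3.0, 2.9, 3.2, 2.7]
comparison_grades = [3.5, 3.3, 3.6, 3.1, 3.4, 3.2, 3.7, 3.0]

# Mann-Whitney U compares the two distributions without assuming normality,
# which suits bounded, often skewed grade data.
result = stats.mannwhitneyu(packback_grades, comparison_grades, alternative="two-sided")
print(f"U = {result.statistic:.1f}, p = {result.pvalue:.3f}")
```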

Future Software Evaluation Processes

The tight timeline and poorly matched comparison data made it challenging to measure Packback’s impacts. EDLI will use this experience to inform recommendations for future evaluations of educational technology at MSU. A strong priority will be allowing a sufficient timeline to develop experimental or quasi-experimental designs when evaluating software.