Page 29 - Campus Technology, October/November 2019
Predictive analytics may be our finish line, but regression discontinuity was the starting line — and it's a methodology that any university can apply when designing first-year experiences and targeted interventions.

and receptivity to counseling to predict which students may encounter the most academic hurdles. Students took the survey upon accepting their offer to attend John Carroll and again at mid-term of second semester. We used a stanine scale, which runs from 1 to 9; students who scored 5 or higher on predicted academic difficulty were eligible for the integrated learning community outlined above. This was the "gold group." For other students, it was business as usual — they comprised the "blue group."

We chose a relatively low cutoff for the intervention because we wanted to ensure we were supporting students who fall in the "murky middle," as the academic literature calls it. Students right in the middle — hovering between a 2.0 and 3.0 GPA, for instance — aren't usually offered support teams or an emergency response. Those approaches are saved for the extremes. Yet these mid-level students still have a higher chance of withdrawing from school. Their GPAs might be okay, but something else could be causing challenges. We wanted to proactively identify these students and those challenges.

At this point, it's important to note that the goal of our project wasn't to prove that integrated learning communities represent the best first-year experience for every institution. Instead, we sought to demonstrate a useful methodology — a way for schools that want to implement a program aligned with their own goals to identify which students to enroll in the program, and then track those students' progress.

Developing Predictive Analytics

The learning intervention itself is only one component of our grant.
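The cutoff rule described above — a stanine score of 5 or higher routes a student into the integrated learning community — is what makes the design amenable to regression discontinuity analysis. A minimal sketch of that assignment rule follows; the function name and variable names are our own illustration, not John Carroll's actual code.

```python
# Illustrative sketch of the gold/blue assignment rule described in the
# article: stanine scores run 1-9, and a predicted-academic-difficulty
# score of 5 or higher made a student eligible for the integrated
# learning community ("gold group"). The `assign_group` helper is
# hypothetical.

STANINE_CUTOFF = 5

def assign_group(stanine_score: int) -> str:
    """Return 'gold' (intervention-eligible) or 'blue' (business as
    usual) for a predicted-academic-difficulty stanine score."""
    if not 1 <= stanine_score <= 9:
        raise ValueError("stanine scores run from 1 to 9")
    return "gold" if stanine_score >= STANINE_CUTOFF else "blue"

# The deliberately low cutoff pulls in mid-scale students — the
# "murky middle" — rather than only the highest-risk extremes.
scores = [2, 4, 5, 7, 9]
print([assign_group(s) for s in scores])
# ['blue', 'blue', 'gold', 'gold', 'gold']
```

Because treatment is determined entirely by which side of the cutoff a student's score falls on, students just below and just above 5 can be compared to estimate the intervention's effect.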
We are also working to better identify factors predictive of student success and, in turn, to develop predictive analytics capabilities to proactively identify especially at-risk students.

To do so, we must be able to see and understand a significant amount of data. We've now had three cohorts of first-year, first-time students, with more than 600 variables in each data set. While the College Student Inventory survey was used to determine which students got the intervention, we also used two other surveys throughout the course of our research: the EQ-i, an emotional intelligence questionnaire, and the Thriving Quotient, a holistic student survey developed at Azusa Pacific University in California. Of course, we also have data from standard outcomes like credit accumulation and GPA, along with standard demographic information.

During the second half of our grant, we started working with GlyphEd, which makes a unique tool that brings all of this data together and displays it in a way that's far more intuitive than SPSS printouts. GlyphEd uses "glyphs" — three-