DATA ANALYTICS
continued from page 31
try to see where students are,” he noted. “And those clickers do generate some pretty good data sets.”
Data capture will determine your predictions. Student success depends on myriad variables, and if your instructors don’t design the course in the LMS to gather data (assignments, tests, online homework, attendance, etc.), the predictive results won’t be valid. For example, class attendance might be a predictor for a given course, but if the instructor doesn’t bother keeping attendance, there’s no way to know how that will influence the outcomes.
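To make that point concrete, here is a minimal, purely illustrative sketch in Python (pandas and scikit-learn). All column names and values are invented, and this is not any vendor’s actual modeling pipeline; it simply shows that a model trained from an LMS export can only weigh the variables the course was designed to capture, so a course that never records attendance cannot surface attendance as a risk factor.

```python
# Hypothetical sketch: a predictive model can only use the signals the
# course is actually set up to capture in the LMS. Columns and values
# below are invented for illustration only.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy "LMS export" for one course.
records = pd.DataFrame({
    "homework_avg":   [0.91, 0.45, 0.78, 0.30, 0.88, 0.52],
    "quiz_avg":       [0.85, 0.50, 0.70, 0.35, 0.90, 0.48],
    "attendance_pct": [0.95, 0.40, 0.80, 0.25, 1.00, 0.55],  # only exists if the instructor records it
    "passed":         [1, 0, 1, 0, 1, 0],
})

# If attendance isn't collected, that column simply isn't available,
# and whatever predictive signal it carried is lost to the model.
features_without_attendance = ["homework_avg", "quiz_avg"]
features_with_attendance = features_without_attendance + ["attendance_pct"]

for cols in (features_without_attendance, features_with_attendance):
    model = LogisticRegression().fit(records[cols], records["passed"])
    # Estimated probability of passing for a hypothetical middling student.
    new_student = pd.DataFrame([{c: 0.5 for c in cols}])
    print(cols, "->", round(model.predict_proba(new_student)[0][1], 2))
```

The two predictions differ only because one model was allowed to see attendance; whether that matters for a real course is exactly the kind of question the data capture decision settles in advance.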
Don’t delay. “Ideally, you would want to give feedback to students within the first four weeks of the semester to make sure they’re getting on board with course concepts and coursework,” noted Fernandes. “If you’re not going to be able to give a prediction that they might fail until the sixth or seventh week (before mid-terms), it’s too late.” And if the course is too complex to know by week four, maybe it’s not the right place to start with predictive analytics. Likewise, if you can’t start at the very beginning of the semester, hold off, she advised. “It’s difficult to catch up with this predictive model idea when you miss the first two weeks.”
Be choosy. An overriding consideration for Bayard is to have “very good diagnostic tools for selecting courses and instructors.” In order to have an LMS course that can give you good data to determine a valid predictive model of student success, you need faculty willing to engage in the
course design, either themselves or with the help of an instructional designer. For instance, while faculty are accustomed to setting up their gradebooks the way they want, learning analytics software is still in its infancy and requires the gradebook to be set up in a particular way in order to yield the kind of data that will be of value to the predictive modeling.
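As a rough illustration of what “set up in a particular way” might mean, the sketch below models a consistently structured gradebook as data. The field names, categories, and weights are invented, not drawn from Blackboard Learn or any other product; the point is only that consistent categories and due dates are what let early-semester scores roll up into comparable features.

```python
# Hypothetical illustration only: a "consistently structured" gradebook
# expressed as data. Field names and categories are invented; real
# learning analytics products define their own requirements.
from dataclasses import dataclass

@dataclass
class GradebookColumn:
    name: str
    category: str        # consistent labels ("homework", "quiz", "exam") let scores be grouped
    points_possible: float
    due_week: int         # a due date lets early-semester performance feed early predictions

course_gradebook = [
    GradebookColumn("HW 1", "homework", 10, due_week=2),
    GradebookColumn("Quiz 1", "quiz", 20, due_week=3),
    GradebookColumn("Midterm", "exam", 100, due_week=8),
]

# A model can only compute something like "homework average by week 4"
# if every column carries a category and a due date; free-form,
# uncategorized columns don't aggregate into comparable features.
early_homework = [c for c in course_gradebook
                  if c.category == "homework" and c.due_week <= 4]
print([c.name for c in early_homework])
```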
Boss buy-in is essential. Without support for and interest in learning analytics from upper administration, “forget it,” suggested Fernandes. “This is a long-term thing. You have to be committed to it. It takes multiple units on campus to talk with each other. If you think, ‘Oh, we’ll stand it up for a year and see if we get return on our investment,’ we’ll tell you: don’t do it.”
Leapfrog the learning. San Diego State, another Blackboard Learn user, began using Predict a semester later than Chico State, enabling it to strengthen its approach. For example, by the time the university implemented Predict, it had already created “really effective, rich media” to signal to students that the school was paying attention to student engagement in the courses. “They did some really fun things,” said Fernandes. One example: a video of an empty chair in a classroom with the message, “We noticed that you were absent today.” That would be accompanied by data showing what students’ chances of passing the course would be if they attended class vs. if they didn’t attend. You have to create clear road signs for students and faculty so no one gets lost.
Predicting success or failure is just the start. While it may be useful for the institutions themselves to understand the factors that lead to success or failure, not everybody is convinced that the students need to know what side of the line they fall on. Chico, for example, declined to show students their predicted failure, while San Diego State chose to inform students. In doing so, however, the university also let students know to “take it with a grain of salt, to not give up,” Fernandes said. “They did it much more gingerly.”
Plus, the decision-making can’t end there, she added. “You shouldn’t get into learning analytics just to find out who’s potentially going to fail and not have the rest of the institution on board with what you’re going to do about it once you know that.” And developing approaches for how to help students based on their risk factors is yet another large institutional project.
Dian Schaffhauser is a senior contributing editor for CT.