Page 22 - THE Journal, May/June 2018

to uncover any misconceptions that might stand in the way of a student’s learning. The platform then uses this information to coach students up in areas where they’re having trouble before proceeding with a lesson.
The software is based on technology developed by the Center for Game Science at the University of Washington. The technology has been shown to improve student mastery of seventh-grade algebra by an average of 93 percent after only 1.5 hours of use, even for students as young as elementary school age.
“AI can enable a much higher level of personalization,” said Zoran Popovic, director of the Center for Game Science and founder of Enlearn. “It can help deliver just the right curriculum that a student needs at a particular moment. That’s what we’re working toward with Enlearn.”
Improving Student Safety
AI also has implications beyond the classroom. For instance, GoGuardian, a Los Angeles company, uses machine learning technology to improve the accuracy of its cloud-based Internet filtering and monitoring software for Chromebooks.
URL-based filtering can be problematic, GoGuardian said, because no system can keep up with the ever-changing nature of the Web. Instead of blocking students’ access to questionable material based on a website’s address or domain name, GoGuardian’s software uses AI to analyze the actual content of a page in real time to determine whether it’s appropriate for students.
Developers have fed the program hundreds of thousands of examples of websites that the company deems appropriate, or not appropriate, for different age levels, and the software has learned how to distinguish between these. What’s more, the program is continually improving as it receives feedback from users. When administrators indicate that a web page was flagged correctly or blocked when it shouldn’t have been, the software learns from this incident and becomes smarter over time.
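The learn-from-feedback loop described above can be sketched in miniature. This is a hypothetical illustration, not GoGuardian’s actual method: a toy classifier scores a page’s words and adjusts its per-word weights whenever an administrator confirms or corrects a decision. The class name, update rule, and example phrases are all assumptions made for illustration.

```python
# Hypothetical sketch (not GoGuardian's actual code): a tiny content
# filter that labels pages and updates itself from administrator
# feedback, illustrating how such software can "learn over time."
from collections import defaultdict

class FeedbackFilter:
    def __init__(self, learning_rate=1.0):
        self.weights = defaultdict(float)  # learned score per word
        self.lr = learning_rate

    def score(self, page_text):
        """Sum the learned word weights; a positive total means 'block'."""
        return sum(self.weights[w] for w in page_text.lower().split())

    def is_blocked(self, page_text):
        return self.score(page_text) > 0

    def feedback(self, page_text, should_block):
        """Perceptron-style update when an admin confirms or corrects
        a decision, so repeated mistakes gradually fade out."""
        if self.is_blocked(page_text) != should_block:
            delta = self.lr if should_block else -self.lr
            for w in page_text.lower().split():
                self.weights[w] += delta

f = FeedbackFilter()
f.feedback("buy pills online cheap", True)   # admin: should be blocked
f.feedback("science homework help", False)   # admin: fine, leave alone
print(f.is_blocked("cheap pills online"))    # True
```

A production system would classify full page content with a trained model rather than bag-of-words keyword weights, but the feedback mechanism is the same in spirit: each correction nudges the model, so accuracy improves with use.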
Like many Internet-safety solutions, GoGuardian also sends automated alerts to a designated administrator whenever students perform a questionable web search or type problematic content into an online form or Google Doc. This kind of real-time behavioral monitoring can prevent students from harming themselves or others: If an administrator sees that a student is searching for information about how to kill herself, for example, he can follow up to get that student the help she needs immediately, before it’s too late.
Traditional keyword flagging generates many false positives, because the software can’t determine the context or the student’s intent. But with AI, these real-time alerts become much more accurate, users of the technology say.
Shad McGaha, chief technology officer for the Wichita Falls Independent School District in Texas, said he used to receive upward of 100 email alerts per day. “They would fill up my inbox with false positives,” he said, and he would have to spend valuable time combing through these alerts to make sure students weren’t in danger.
Since GoGuardian began using AI to power its software, he now receives only a handful of notifications, and all but 2 or 3 percent of these result in actionable insights.
“A few months back, we had a student who was flagged for self-harm. We were able to step in and stop that from happening,” he said. “That’s pretty incredible.”
Key Considerations
Despite its potential, AI has limitations, as well. “AI works best when it has a vast number of samples to learn from,” Popovic said. It can be challenging to get a large enough sample size for the technology to be effective in a high-stakes environment like education, he explained, where teachers can’t afford to make mistakes with students.
For AI to be effective, the data it uses
to reach its conclusions must be sound. “If data that are not accurate are thrown into the mix, it will return inaccurate results,” Calhoun Williams said. There is no such thing as unbiased data, she noted — and bias can be amplified by the iterative nature of some algorithms.
The technology also raises serious privacy concerns. It requires an increased focus not only on data quality and accuracy, but also on the responsible stewardship of this information. “School leaders need to get ready for AI from a policy standpoint,” Calhoun Williams said. For instance: What steps will administrators take to secure student data and ensure the privacy of this information? “If necessary, over-communicate about what you’re doing with data and how this will benefit students,” she advised.
Some adaptive learning products don’t use true machine learning technology, but rather a form of branching technology where the software chooses one of several routes based on how students respond to the content.
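The distinction can be made concrete with a toy sketch, using purely hypothetical names: branching software follows a fixed, hand-authored decision rule, so its behavior never changes no matter how many students pass through it, whereas a machine learning system adjusts learned parameters from data.

```python
# Illustrative contrast (all names hypothetical): a "branching"
# adaptive lesson is a hand-written rule, not a learned model.
# Nothing here is trained from data; the routes are fixed forever.
def next_lesson(quiz_score):
    """Route a student along one of several pre-written paths
    based on a quiz score between 0.0 and 1.0."""
    if quiz_score >= 0.8:
        return "enrichment_unit"
    if quiz_score >= 0.5:
        return "next_chapter"
    return "remediation_review"

print(next_lesson(0.9))  # enrichment_unit
print(next_lesson(0.3))  # remediation_review
```

Branching like this can still be useful, but because the thresholds and paths never adapt to new evidence, calling it “AI” stretches the term, which is the point of the warning that follows.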
“Educators have to be careful of companies claiming to use AI in their software,” she warned. “Some vendors are latching on to the AI bandwagon, but make sure you know what they mean when they make this claim.”
Some questions to ask of vendors include:
• What does AI mean to you? How does this product fulfill that definition?
• How is your product superior to current options with no AI?
• Once I install your product, how will its performance improve through AI? How should I expect to devote staff time to such improvements?
• What data and computing requirements will I need to build the models for this solution?
• How can I see what will happen to the data used by your software?
Another consideration is: What is the scope of the learning environment that can be varied? Presenting the same material
in a different sequence has little impact on student outcomes, Popovic explained: “If all I can do with AI is send you to chapter 7 of the material, that’s not enough to make a difference in learning. But if the software were to generate new content on the fly based on how students are responding, that would be a game changer.”
Dennis Pierce is a freelance writer with 20 years of experience covering education and technology. Alice Hathaway is a high school senior at Maynard High School in Massachusetts.