Page 14 - College Planning & Management, June 2018

Emerging Technology ENHANCING, ENGAGING, CONNECTING
Learning Analytics
Closing the loop between delivery and impact.
BY DAVID W. DODD
TODAY’S INSTITUTIONS ARE typically characterized by widespread use of learning management systems, such as the highly regarded Canvas. Our physical classrooms feature projectors, smartboards, sound systems, and similar tools from recognized vendors including Epson, Panasonic, NEC, and SMART, among others. Online courses are supported by Zoom, Kaltura, Cisco, and other digital media and conferencing technologies. The teaching and learning environments we’ve constructed are robust and feature-rich. But how do we know they are working? How do we know whether students are actually learning more effectively as a result of these investments? Work has begun in this area of assessment, but there is far more to do.
As the late W. Edwards Deming is noted for saying, “In God we trust; all others bring data.” Various studies have been conducted over the years aimed at assessing the impact of technology on teaching and learning. Many were relatively informal and not based on sound research methodology. Further, such research involves humans, who are far more complicated and challenging to study than, say, the specific gravity of a mineral. Quantitative studies alone are inadequate. Similarly, simply asking students how they “feel” about technology is meaningless. Previous studies have been all over the proverbial board in this regard.
One approach that began about a decade ago and that is rapidly growing in popularity is called learning analytics. As defined by the EDUCAUSE Learning Initiative (ELI), learning analytics is the use of data, analysis, and predictive modeling to improve teaching and learning. The term is thrown around so much and so loosely that its inherent value is often lost, and that is most unfortunate. Learning analytics in actual usage is both sound and substantive. This field takes into account the complex nature of humans, as well as the fact that individual learners are inherently different. Learning analytics also stipulates that learners act within a context that must be considered. Fundamentally, this equates to the difference between rote, recall, and grades on the one hand, and learning, retention, and application on the other.
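ELI’s “data, analysis, and predictive modeling” framing can be made concrete with a deliberately simplified sketch. The function below combines LMS engagement signals into an at-risk probability via a logistic link; the feature names and weights are invented for illustration and are not drawn from any validated model or from the systems named in this article.

```python
import math

def at_risk_score(logins_per_week, avg_minutes_on_task, assignments_submitted_pct):
    """Illustrative only: the weights below are hypothetical, not fitted.

    Combines LMS engagement signals into a 0-1 'at risk' probability.
    Higher engagement (more logins, more time on task, more submissions)
    pushes the score toward 0; disengagement pushes it toward 1.
    """
    # Linear combination of hypothetical engagement signals
    z = (2.0
         - 0.15 * logins_per_week
         - 0.01 * avg_minutes_on_task
         - 2.5 * assignments_submitted_pct)
    # Logistic link maps the raw score onto (0, 1)
    return 1.0 / (1.0 + math.exp(-z))
```

A real deployment would fit such weights to institutional outcome data rather than hand-picking them, and would validate the model before acting on its predictions.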
The Goal is Improved Learning
Another critical aspect of learning analytics is that the fundamental goal is improving learning. Rather than grading, which is transactional, the goal of learning analytics is to be transformational. The purpose is understanding the complex process of human learning and finding ways to improve it. In the end, this can include accounting for individual student needs and strengths, learning environments, practices and behaviors, tools and technologies, and many other factors.
The information upon which learning analytics is based comes from technologies that include learning management systems and next-generation student information systems. And in a bit of irony, research based on this information is used to further strengthen not only these technologies, but also pedagogical practices, campus environments, “nudge” factors such as coaching effective student practices and behaviors, and other important elements.
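The “nudge” factors mentioned above can likewise be sketched. As a purely hypothetical trigger (the rule and its threshold are assumptions, not a published heuristic), a coaching nudge might fire when a student’s recent LMS activity drops sharply against their own baseline:

```python
def students_to_nudge(activity_by_week, drop_threshold=0.5):
    """Hypothetical nudge trigger, for illustration only.

    activity_by_week: {student_id: [week1_logins, week2_logins, ...]}.
    Flags students whose most recent week's activity fell below
    drop_threshold times their own prior average.
    """
    flagged = []
    for student, weeks in activity_by_week.items():
        if len(weeks) < 2:
            continue  # not enough history to detect a drop
        prior_avg = sum(weeks[:-1]) / len(weeks[:-1])
        if prior_avg > 0 and weeks[-1] < drop_threshold * prior_avg:
            flagged.append(student)
    return flagged
```

Comparing each student against their own baseline, rather than a campus-wide average, is one way such a rule can respect the individual learner differences the article emphasizes.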
When practiced effectively, learning analytics is a strategic rather than an operational endeavor. It is not merely quantitative optimization, but thoughtful and innovative inquiry. It does not assume the efficacy of legacy practices, but challenges them. It is not institution- or teacher-centered, but truly student-centered. Indeed, it is based on finding ways to most effectively fulfill the central mission of our institutions: student learning.
Look Beyond Legacy Practices
In order to build campus environments that can benefit from learning analytics, institutions must first look beyond legacy practices that are transaction-based. Doing so means incorporating more considerations into the selection of systems, technologies, and professional development programs for faculty. It means selecting systems not merely for successfully fulfilling transactions, but more importantly, for the information they provide to educators. Excellent systems are both available and emerging that incorporate these principles. They include such standouts as the Instructure Canvas LMS, the emerging Workday Student system, and systems being developed internally at visionary universities.
To be clear, learning analytics is not based on choosing the right technologies. It is based on creating a campus culture that places a high value on constantly rethinking and improving stu- dent learning. As a result, learning analytics is a holistic endeavor.
With innovative approaches, there are inevitably corresponding challenges. With learning analytics, these include respecting student privacy, valuing individual learner differences, and fostering conversations about student learning, among others. But students are at the core of our institutional missions. Constantly improving student learning is our most important challenge. CPM
David W. Dodd is vice president of Information Technology and CIO at the Stevens Institute of Technology in Hoboken, NJ. He can be reached at 201/216-5491 or david.dodd@stevens.edu.