
What’s the evidence for learning analytics?

Learning analytics is a new area of activity, but it is built on top of other disciplines with a long history. Some, such as statistics, have been around for centuries; others, like social network analysis, are much more recent. There is not yet much published evidence, however, of direct impact from learning analytics on measures of student success such as grades and retention.


Clearly we’re still near the start of a journey in understanding how best to use the rapidly accumulating amounts of data we have on students and their learning to improve their educational experience and outcomes. Many projects are being carried out on a shoestring by enthusiasts and small groups of researchers, and few of the bigger institutionally-funded initiatives have been in place long enough to demonstrate their long-term impact.

Even if the analytics give us real insight into student learning and what we might do to enhance it, we don’t necessarily yet have the best intervention strategies in place to use that knowledge. And to close the loop of the learning analytics cycle described by Doug Clow, we need to evaluate the effectiveness of any actions we take on the basis of the analytics, and then see how we can build the best interventions into institutional policies and the practices of individual teachers and tutors.

In order to invest in learning analytics – even to pilot it at sufficient scale to assess its effectiveness – institutions will be looking for evidence from elsewhere of return on investment. To this end we’ve summarised some of the findings which have been published, and appear to show a demonstrable positive impact. See the Jisc briefing: Learning analytics and student success – assessing the evidence.

One caveat, of course, with any published evidence is that it may be in the interests of the authors or their organisations to portray the outcomes of their initiatives in a positive light. Researchers who show that learning analytics is working may receive further funding. Vendors who publish such evidence in white papers stand to sell more of their products. A second caveat is that the implementation of learning analytics may be one of a series of changes, and any improvements may be difficult to attribute to its use.

However, I’ve encountered a consistent honesty and striving for truth in this research community (and among the vendors too). No-one seems to be making any grand claims for the efficacy of learning analytics: if they were, and couldn’t justify them, they’d pretty quickly be knocked off their perch by a sceptical educational research community. The studies assembled in the LACE project’s Evidence Hub are overwhelmingly positive, but a few less successful ones are also included.

Most people working in this area believe that having better data at our fingertips about our students, so long as it is handled ethically, has the potential to impact positively on both learning and teaching. This is what is driving us in Jisc in our efforts to help the sector develop its understanding of learning analytics. The studies summarised in this brief report are some of the initial indicators that the community is on the right track. Those which carry out A/B testing, where one group is subject to interventions on the basis of the data and another group isn’t, are some of the most convincing.
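To give a sense of how such comparisons are typically assessed, here is a minimal sketch in Python (with entirely invented figures) of a two-proportion z-test comparing retention rates between an intervention group and a control group. It is an illustration only; the studies summarised in the briefing use their own designs and statistical methods.

import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_a, p_b, z, p_value

# Hypothetical figures: 412 of 480 students retained in the group that
# received data-driven interventions, versus 371 of 475 in the control group.
p_int, p_ctl, z, p = two_proportion_ztest(412, 480, 371, 475)
print(f"Retention: intervention {p_int:.1%}, control {p_ctl:.1%}")
print(f"z = {z:.2f}, two-sided p = {p:.4f}")

A small p-value here would suggest the difference in retention is unlikely to be down to chance alone, which is why studies with a comparison group of this kind carry more weight than before-and-after figures.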

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.
