
More sophisticated learner engagement metrics – and doing something with them

I’ve described some of the basic reporting functionality available for Moodle and Blackboard, but this is just scratching the surface of what is possible with learning analytics. In this post I’ll look at ways in which analytics from other data sources, such as video and e-books, are being brought into the VLE to help gain a better picture of student engagement. I’ll also describe a system which makes analytics on their activities available to the learners themselves – and helps staff to manage subsequent interventions.

Matteo: "Giorgia - leggete gli e-book!"  CC BY-NC 2.0
Matteo: “Giorgia – leggete gli e-book!” CC BY-NC 2.0

Adding analytics from e-book usage
While most learning analytics systems use data available in the virtual learning environment to measure engagement, VitalSource CourseSmart Analytics delves into students’ use of e-books in order to help “improve retention, control costs and improve learning outcomes”. VitalSource is a company which rents out textbooks and enables usage monitoring through its own e-book reader software. The system assesses engagement using various metrics, including the duration of each e-book reading session, the number of pages viewed, and activities such as highlighting or taking notes.

The (as yet fairly simplistic) analysis is presented in a dashboard which includes an “engagement index” derived from the usage data. The company claims that its research shows this index to be a stronger predictor of students’ academic outcomes than their prior educational attainment.
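VitalSource doesn’t publish the formula behind the engagement index, but conceptually it is a weighted combination of the usage signals described above. Here is a minimal sketch of how such an index might be computed; the field names and weights are entirely hypothetical and are not VitalSource’s.

```python
from dataclasses import dataclass

@dataclass
class ReadingSession:
    """One e-book reading session for a student (hypothetical schema)."""
    minutes_read: float
    pages_viewed: int
    highlights: int
    notes: int

def engagement_index(sessions: list[ReadingSession]) -> float:
    """Boil a student's reading activity down to a single 0-100 score.

    The weights are illustrative only; the real CourseSmart Analytics
    index is proprietary and may be modelled quite differently.
    """
    minutes = sum(s.minutes_read for s in sessions)
    pages = sum(s.pages_viewed for s in sessions)
    annotations = sum(s.highlights + s.notes for s in sessions)
    raw = 0.5 * minutes + 0.3 * pages + 2.0 * annotations
    return min(100.0, raw / 10)  # capped for display on a dashboard

weekly_sessions = [
    ReadingSession(minutes_read=45, pages_viewed=30, highlights=4, notes=1),
    ReadingSession(minutes_read=20, pages_viewed=12, highlights=0, notes=0),
]
print(round(engagement_index(weekly_sessions), 1))  # 5.5 with these illustrative weights
```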

The dashboards are built on top of the GoodData business intelligence platform and can be viewed from within the VLE. The software uses the IMS Learning Tools Interoperability (LTI) framework to make the data available to VLEs.
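As an aside, an LTI 1.x launch is essentially a signed HTTP POST from the VLE to the tool. The sketch below, using the oauthlib library, shows the general shape of such a launch; the key, secret and URL are placeholders, and this illustrates the LTI mechanism in general rather than VitalSource’s actual integration.

```python
from urllib.parse import urlencode
from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

# Placeholder credentials and endpoint; in practice these come from the tool provider.
CONSUMER_KEY = "example-key"
SHARED_SECRET = "example-secret"
LAUNCH_URL = "https://analytics.example.com/lti/launch"

# Standard LTI 1.1 basic launch parameters identifying the course and the user.
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course-101-ebook-dashboard",
    "context_id": "course-101",
    "user_id": "student-42",
    "roles": "Learner",
}

# Sign the form body with OAuth 1.0, as an LTI 1.x consumer (the VLE) would.
client = Client(CONSUMER_KEY, client_secret=SHARED_SECRET,
                signature_type=SIGNATURE_TYPE_BODY)
uri, headers, body = client.sign(
    LAUNCH_URL,
    http_method="POST",
    body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(body)  # the signed form body the VLE would POST to the tool
```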

Various US institutions which have deployed the software are profiled on the company website and in some white papers. The suggested users of the product are: instructors (to assess performance and intervene as appropriate), deans and other senior staff (to assess the uptake and effectiveness of e-books), and publishers (to assess the relative impact of digital course materials in order to improve their offerings). There’s no suggestion from VitalSource that learners themselves would benefit from viewing the data, and whether students are comfortable with having their e-book usage analysed in this way is another matter. I did some thinking about this a while back in: Making ebooks more interactive: logistics and ethics.

Analytics from video viewing and contributing
One thing not handled very well by most VLEs is video. Various plugins have emerged to deal with this, and notable amongst them is Kaltura, an open source system available for all the main VLE products. Kaltura handles the content management aspects of hosting video and enables contributions from students as well as staff. It also provides analytics on the viewing and contributing of video clips. This allows staff to see:

  • Which videos are students watching the most?
  • Which students contribute the most videos?
  • Which students watch the most videos?
  • How long are students watching each video?

This information can certainly help you discover what your most engaging video content is. It can also give some indication of engagement for individual students, both in viewing and in posting. A table shows the top five most engaged users, including how many videos they watched, how many minutes they spent, what their average view time was, and their “average drop-off” percentage, i.e. how much of each video they actually watched.
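As a rough illustration, those per-student figures could be derived from raw viewing events along these lines (the event structure here is invented for the example and isn’t Kaltura’s reporting API):

```python
from collections import defaultdict

# One row per viewing event: (student, video, seconds watched, video length in seconds).
view_events = [
    ("alice", "intro.mp4",    300, 600),
    ("alice", "lab-demo.mp4", 540, 540),
    ("bob",   "intro.mp4",    120, 600),
]

per_student = defaultdict(lambda: {"views": 0, "seconds": 0, "completion": []})
for student, video, watched, length in view_events:
    stats = per_student[student]
    stats["views"] += 1
    stats["seconds"] += watched
    stats["completion"].append(watched / length)

for student, stats in sorted(per_student.items()):
    avg_view_min = stats["seconds"] / stats["views"] / 60
    drop_off = 100 * (1 - sum(stats["completion"]) / stats["views"])  # % of each video left unwatched
    print(f"{student}: {stats['views']} views, {stats['seconds'] / 60:.0f} min total, "
          f"{avg_view_min:.1f} min average view, {drop_off:.0f}% average drop-off")
```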

These viewing statistics are very limited, though, for the purposes of learning analytics; a natural evolution of the functionality would be to produce an indicator of student engagement in viewing video (and in contributing it, if appropriate). In a similar way to the CourseSmart Analytics engagement index, this could be made available to the VLE together with other engagement data to build a fuller picture of student participation in courses with high video content or a requirement to contribute video.

The Kaltura website lists a number of high profile US universities as customers together with Durham and Manchester Metropolitan in the UK.

Course Signals
A more sophisticated engagement reporting system than the ones I’ve described so far for Moodle and Blackboard is Course Signals. This was originally developed at Purdue University and is now one of five products which comprise the Ellucian Student Success Suite. It has received much publicity due to claims of dramatic positive correlations between use of the software and measures of student success: retention, in particular, was claimed to be 21% higher among students who had used the system. However, Purdue’s findings were subsequently challenged as not necessarily demonstrating a causal relationship.

Like the simpler VLE reporting tools described earlier, Course Signals provides indicators of whether students are on track to complete their course, based on their online participation. The software was built to integrate with VLEs including Blackboard Learn and Desire2Learn Brightspace.

At the heart of the system are traffic light indicators, displayed in the VLE, which tell students whether they are performing well, holding steady or underperforming, and prompt them to take action. A staff member might, for example, specify a minimum performance requirement for a particular course. If a student’s performance falls below this, a red signal appears on the staff dashboard and an email is sent to the student. A yellow signal shows that a student’s performance is approaching the minimum acceptable level, while green suggests that a student is doing what is required to pass. The main metrics are:

  • Grade – allows you to set value ranges for grades which equate to red, yellow and green signals.
  • Effort – a measure of how much a student uses specified course resources in the VLE within a specified date range.

It’s possible to filter students based on the red/yellow/green “signals” generated for grade and effort – for example, those who have worsened in a class or those who have red signals in two or more classes.
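To make the mechanism concrete, here is a minimal sketch of the kind of rule such a traffic light indicator implements, with made-up thresholds for grade and effort; Course Signals’ actual algorithm is more elaborate and, as noted below, also draws on data from outside the VLE such as prior qualifications.

```python
def signal(grade: float, effort: float,
           grade_red: float = 50, grade_yellow: float = 60,
           effort_red: float = 0.3, effort_yellow: float = 0.5) -> str:
    """Map a grade (%) and an effort score (share of specified course
    resources used in the date range) to a red/yellow/green signal.
    Thresholds are illustrative; in Course Signals staff set them per course."""
    if grade < grade_red or effort < effort_red:
        return "red"
    if grade < grade_yellow or effort < effort_yellow:
        return "yellow"
    return "green"

students = {"carla": (45, 0.2), "dev": (58, 0.6), "emi": (72, 0.8)}
signals = {name: signal(grade, effort) for name, (grade, effort) in students.items()}

# Filter to students showing a red signal, e.g. to target an intervention.
at_risk = [name for name, colour in signals.items() if colour == "red"]
print(signals)   # {'carla': 'red', 'dev': 'yellow', 'emi': 'green'}
print(at_risk)   # ['carla']
```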

There are at least five things which make Course Signals much more sophisticated than the basic reporting tools for Moodle and Blackboard:

  1. it can bring in data from outside the VLE such as prior qualifications
  2. it presents the indicators to students as well as staff
  3. it works across courses rather than just on a single course
  4. it can trigger automated interventions
  5. it adds workflow management for interventions

Tools are provided for detecting at-risk students, triggering a response, setting up and tracking an action plan, and monitoring its success. It’s possible to tailor and track your communications using the system, and to add automated, event-triggered reminders for students to carry out specified tasks.
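The following sketch shows the general shape of an event-triggered intervention of that kind: a signal turning red queues a templated email and a tracked follow-up task. The class and field names are invented for illustration and don’t reflect Course Signals’ internals.

```python
from dataclasses import dataclass, field

@dataclass
class Intervention:
    """A tracked intervention: why it was triggered and what actions it queued."""
    student: str
    trigger: str
    actions: list[str] = field(default_factory=list)
    resolved: bool = False

def on_signal_change(student: str, old: str, new: str, log: list) -> None:
    """Event handler: when a student's signal worsens to red, queue a
    templated email and a follow-up task for the tutor, and log both."""
    if new == "red" and old != "red":
        intervention = Intervention(student=student, trigger=f"{old} -> {new}")
        intervention.actions.append("send templated email to student")
        intervention.actions.append("create follow-up task for personal tutor")
        log.append(intervention)

interventions: list[Intervention] = []
on_signal_change("carla", old="yellow", new="red", log=interventions)
print(interventions)
```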

With this level of workflow management, Course Signals begins to have the feel of a customer relationship management system rather than a simple VLE reporting tool. The indicators of engagement, with all the potential data sources behind them, are boiled down to simple red/yellow/green traffic lights for grade and effort, but these then trigger a range of automated and human interventions and communications which are tracked by the system. You can have all the metrics you like for measuring engagement, but effective management of interventions is what could really start making an impact on student outcomes across an institution.

 

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.

