Second pathfinder meeting: addressing common institutional challenges

Group discussion in Jisc's London offices

At our recent pathfinder meeting, a number of institutions involved in implementing Jisc’s learning analytics architecture came together to work on issues of common concern. This followed a get-together in Bristol last December where we looked at institutional culture, ethical & legal issues and data.

Participants from the universities present discussed a different range of issues this time, which I describe below (with thanks to Paul Bailey, Tony Sceales and Tim Stratton for their notes).

Business transition to using learning analytics
Tony Sceales, an entrepreneur brought in by Jisc to work on areas such as assisting institutions in building their business cases, facilitated discussions in this area. Tony has developed a tool which helps to quantify the costs and benefits of an institutional project.

One participant described how his university is at a fairly advanced stage and has put together a student panel to ensure that learners’ voices are incorporated. Learning analytics is currently being treated there as an R&D project to discover whether it can enhance student success. The institution has been looking at Eduroam logins as a proxy for attendance, which has also helped it understand how staff and students use its buildings at different times of day. Library logins and accesses to digital content are also being analysed. There is particular interest in attempting to understand which curriculum structures lead to the best outcomes.
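
To make the attendance proxy idea concrete, here is a minimal sketch of how wifi login events might be aggregated into a count of days on campus per student. The event format and field names are my own illustrative assumptions, not the data model used by this institution or by Jisc's architecture.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical Eduroam login events; the field names are illustrative only.
login_events = [
    {"student_id": "s001", "timestamp": "2017-02-06T09:15:00"},
    {"student_id": "s001", "timestamp": "2017-02-06T14:02:00"},
    {"student_id": "s002", "timestamp": "2017-02-07T11:30:00"},
]

# Count the distinct days on which each student logged in, as a rough
# proxy for on-campus attendance.
days_present = defaultdict(set)
for event in login_events:
    day = datetime.fromisoformat(event["timestamp"]).date()
    days_present[event["student_id"]].add(day)

for student, days in days_present.items():
    print(student, len(days), "distinct days on campus")
```

The same aggregation, broken down by building or time of day, is what makes login data useful for understanding space utilisation as well as engagement.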

Another member of the group described how her university was still at a relatively early stage. She felt that the relevant pro vice chancellors had not been engaged early enough, and that executive level engagement is critical for success. This is a finding that is borne out in a number of studies, including a useful Australian report by Colvin et al.

She was also seeing wide variations in engagement across different schools due to cultural and practical issues such as budget reductions. She felt that “Do no harm” is a helpful guiding principle for learning analytics and that the forthcoming EU General Data Protection Regulation will be a driver for institutions in sorting out their processes around student data.

Adapting predictive models
This group also discussed whether it would be possible to turn on and off the factors used in the predictive model and the interventions they drive. Whilst this is theoretically possible, the group felt that the accuracy of the model would be likely to be reduced with each metric removed. However, studies such as that reported in the seminal paper from Marist College by Sandeep Jayaprakash and colleagues have shown that some indicators may have very limited impact on the predictive abilities of a particular model.
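
One straightforward way to see how much an individual indicator contributes is to retrain the model with and without it and compare accuracy, a simple form of feature ablation. The sketch below does this with scikit-learn on purely synthetic data; it is illustrative only and is not the Jisc model or the Marist model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500

# Synthetic engagement indicators (not real student records): VLE activity,
# attendance, assignment submission.
X = rng.normal(size=(n, 3))
# Synthetic "at risk" label driven mainly by the first two indicators.
y = (X[:, 0] + X[:, 1] + 0.1 * X[:, 2] + rng.normal(scale=0.5, size=n) < 0).astype(int)

full = cross_val_score(LogisticRegression(), X, y, cv=5).mean()
print(f"All indicators: {full:.3f}")

# Remove each indicator in turn and see how much accuracy drops.
for i, name in enumerate(["VLE activity", "attendance", "assignments"]):
    reduced = cross_val_score(LogisticRegression(), np.delete(X, i, axis=1), y, cv=5).mean()
    print(f"Without {name}: {reduced:.3f}")
```

In this toy example dropping the weak third indicator barely changes the score, while dropping either of the strong ones does, which is exactly the kind of evidence institutions would want before switching a factor off.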

Jisc is developing a single/common predictive model to include in its standard offering, but it will be possible to procure further models to be built either by vendors or by Jisc. We expect vendors to start to differentiate themselves in this space, and some may decide to share their models freely.

The models will of course be imperfect and will deliver some level of false positive predictions for students at risk. These will need to be carefully managed in the context of the relationship between teachers and students, given the “Do no harm” principle. As we transition from an R&D project to an operational service, the group considered it important to recognise the need to mediate between the predictions and the student. This is especially true where a prediction is based on environmental or demographic factors rather than on metrics for attainment or engagement. The role of the student retention office(r), where one exists, in determining intervention processes and the model for recording interventions was thought to be key.

Developing a picture of a successful end-state for learning analytics
Participants also asked for help in developing a picture of what a successful end-state looks like for learning analytics, and of the journey to get there. User stories, case studies, testimonials and examples of best practice would all be useful. Knowing which factors most heavily affect the outcomes and the accuracy of predictions would help to build an evidence base we can rely on and build upon. Key metrics were thought to be VLE usage, attendance, timely assignment completion and assessment results.

It was a desire to cover just these sorts of issues for institutions that led me to write my recent book, Learning Analytics Explained. A growing range of publications may also prove helpful in this regard, including our review of UK and international practice, Learning analytics in higher education. Insight gained as we roll out Jisc’s architecture will no doubt continue to be shared too via this blog.

Budget implications
Tony Sceales presented his business costing tool for learning analytics, which prompted a number of thoughts from participants. One of these was that you will always lose some students regardless of what you put in place to try to prevent this; moreover, there is a cost to intervening, and an institution does save some costs when students drop out.

Someone else made the point that investment in the institution is required over and above the cost of the software. Well, this is no surprise to anyone who has tried to implement new software at an educational institution – and in the case of learning analytics, it will be essential to help staff understand how to interpret the data and which interventions are likely to have the biggest impact. A way to calculate such costs is included in Tony’s costing tool.
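
To show the shape of the calculation, here is a back-of-envelope sketch of the retention arithmetic. All of the figures are made up for the purposes of illustration and are not taken from Tony's costing tool.

```python
# Illustrative retention cost-benefit calculation; every figure below is an
# assumption for the example, not a number from Tony's tool.
students_at_risk = 200
baseline_dropout_rate = 0.30            # proportion of at-risk students who would leave
intervention_effectiveness = 0.25       # proportion of those dropouts prevented
fee_income_per_retained_student = 9000  # annual fee income (GBP)
cost_saved_per_dropout = 1500           # teaching/support costs avoided when a student leaves
cost_per_intervention = 150             # staff time, retention officer follow-up, etc.

dropouts_prevented = students_at_risk * baseline_dropout_rate * intervention_effectiveness
gross_benefit = dropouts_prevented * (fee_income_per_retained_student - cost_saved_per_dropout)
total_intervention_cost = students_at_risk * cost_per_intervention

print(f"Dropouts prevented: {dropouts_prevented:.0f}")
print(f"Net benefit: £{gross_benefit - total_intervention_cost:,.0f}")
```

The point of laying it out like this is that the net benefit is sensitive to both the effectiveness of interventions and their per-student cost, which is why staff time needs to appear in the business case alongside the software.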

People also wanted to know about ongoing costs for the Jisc solution. These are being worked on at the moment and will be shared soon.  Jisc’s offering will include the Learning Records Warehouse (LRW), the Apereo open source predictive engine and a basic predictive model – as well as Study Goal and Data Explorer. More sophisticated analytics tools which integrate with the LRW will be able to be procured from leading vendors.

Consent
In this discussion we looked at issues around obtaining student consent under the forthcoming EU General Data Protection Regulation, with which universities will need to comply. Our suggested way forward is fully explained in our earlier post: Consent for learning analytics: some practical guidance for institutions.

Key points were:

  1. It’s unreasonable to expect students to understand what they’re signing up to by consenting to data collection and to learning analytics-based interventions on day one of their studies
  2. Using consent as the justification for collecting (most) data is not necessary or appropriate anyway and seriously restricts the possibilities for its use further down the line
  3. Justify data collection on the basis of the legitimate interests of the organisation, with the exception of sensitive data, for which consent must be obtained
  4. Obtain consent for any interventions you wish to offer to students

Interventions
Paul Bailey led this discussion, which looked at issues around integrating learning analytics with existing student support and tutorial processes. Most institutions have a personal tutor system in place but are concerned about whether the timing and frequency of tutor meetings allow for timely interventions. There is also a potentially increased workload for tutors if they receive regular alerts about students at risk.

Student support services offer another important source of interventions to help students at risk. Some institutions are putting in place specific roles, e.g. student retention officers, who focus on supporting the students most at risk or those identified as likely to benefit most from timely interventions.

The key challenge in any institution seems to be to develop a holistic view of the support and interventions being targeted at individual students and to ensure a coordinated approach from all staff involved. Many institutions have tutor dashboards and/or student services systems but these are rarely joined up. The Jisc learning analytics service offers a centralised place to hold and share information on interventions made with individual students. The Data Explorer tool will provide a reference implementation to record and view interventions on students and allow local and vendor solutions to integrate with the Learning Records Warehouse.
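
As a way of picturing what a shared record of interventions might contain, here is a hypothetical structure for logging an intervention centrally so that tutors, student services and retention officers see the same information. The field names are illustrative assumptions and do not reflect the actual Data Explorer or Learning Records Warehouse schema.

```python
from datetime import datetime, timezone

# Hypothetical intervention record; field names are illustrative only and are
# not the Data Explorer or LRW data model.
intervention = {
    "student_id": "s001",
    "raised_by": "personal_tutor",
    "trigger": "predicted_at_risk",       # e.g. model prediction, missed assessment
    "action": "tutorial_meeting_arranged",
    "notes": "Discussed missed assignments; referred to study skills support.",
    "recorded_at": datetime.now(timezone.utc).isoformat(),
    "outcome": None,                       # to be updated after follow-up
}

def record_intervention(store, record):
    """Append an intervention to a shared store keyed by student."""
    store.setdefault(record["student_id"], []).append(record)

shared_store = {}
record_intervention(shared_store, intervention)
print(len(shared_store["s001"]), "intervention(s) recorded for s001")
```

Whatever the eventual schema, the value comes from every member of staff writing to, and reading from, the same store rather than keeping interventions in separate tutor and student services systems.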

Jisc Learning Analytics Service
Jisc is now finalising plans for moving from project mode to the provision of a learning analytics service. Michael Webb outlined the plans for this. One participant wanted to know how to request changes and how institutions would be notified about changes. Michael pointed out the difference between fixes and changes. Bugs are fixed as quickly as possible. The data structures will be renewed around Easter, with tools updated accordingly at the start of summer. More substantial changes will take place during the summer when there will be a new major release. Less significant “non-breaking” changes will occur throughout the year. Participating institutions will shortly be informed of the process for submitting change requests.

Study Goal, Jisc’s student app for learning analytics, is now available in demo mode (versions for Android & iOS) for institutions to try out. One participant asked for the ability for other apps and systems to be integrated with the Learning Records Warehouse (LRW). Attendance monitoring is one such application, and this is currently being trialled with some institutions. Someone pointed out that it is nearly impossible to record all attendance at universities – however, even partial records of attendance (or non-attendance) could be sufficient to flag at-risk students.
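
A minimal sketch of that last point: even where attendance is only captured for some sessions, the rate among recorded sessions can still surface students worth a closer look. The threshold, minimum-evidence rule and data below are illustrative assumptions.

```python
# Flag students whose recorded attendance rate falls below a threshold.
# Only sessions where attendance was actually captured are counted, so
# partial records can still be useful. Figures are illustrative only.
recorded_sessions = {
    # student_id: (sessions attended, sessions where attendance was recorded)
    "s001": (2, 10),
    "s002": (8, 9),
    "s003": (1, 3),
}

THRESHOLD = 0.5
MIN_RECORDED = 5  # don't flag on the basis of too little evidence

for student, (attended, recorded) in recorded_sessions.items():
    if recorded >= MIN_RECORDED and attended / recorded < THRESHOLD:
        print(f"{student}: attended {attended}/{recorded} recorded sessions - flag for follow-up")
```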


By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.
