Jisc’s Learning Analytics Network got off to a great start last week with a workshop hosted by the University of East London. The event was fully subscribed, with around 50 people attending from 31 different institutions, demonstrating the high level of interest in learning analytics across the UK.
Staff and students very positive about dashboards at Nottingham Trent
Mike Day, Director of Information Systems at Nottingham Trent University, gave the first presentation. NTU is particularly advanced in its use of learning analytics. Mike discussed how his university already has good retention levels but wanted to use data to better inform interventions, improving attainment and students’ sense of belonging. A dashboard was built using HP Autonomy and is now in use across the institution, combining biographical information with data sources such as door swipes, library loans and VLE use. This enables comparison of engagement across a cohort and raises alerts if students appear to be disengaged.
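To make the mechanics concrete, here is a minimal sketch in Python of how signals like these might be combined into a cohort-level engagement comparison that raises alerts. The column names, weights and threshold are illustrative assumptions, not NTU’s actual model.

```python
import pandas as pd

# Illustrative weekly activity counts per student. In practice these would be
# drawn from door-swipe, library and VLE systems (all names here are assumed).
activity = pd.DataFrame({
    "student_id":    ["s001", "s002", "s003"],
    "door_swipes":   [12, 1, 7],
    "library_loans": [3, 0, 1],
    "vle_logins":    [25, 2, 14],
})

# Combine the sources into a single weighted engagement score
# (the weights are arbitrary, chosen purely for illustration).
weights = {"door_swipes": 1.0, "library_loans": 2.0, "vle_logins": 0.5}
activity["engagement"] = sum(activity[col] * w for col, w in weights.items())

# Compare each student against the cohort and raise an alert for anyone
# well below the cohort average (the 50% threshold is an assumption).
cohort_mean = activity["engagement"].mean()
activity["alert"] = activity["engagement"] < 0.5 * cohort_mean

print(activity[["student_id", "engagement", "alert"]])
```

A real system would obviously weight and normalise these signals far more carefully, but the basic pattern of scoring engagement relative to the cohort and flagging outliers is the same.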
Students are “strongly positive” about the dashboard, with 93% of them wanting to be warned if they’re at risk of failure. Staff are also very positive. The dashboard confirms that engagement is the strongest predictor of progression, and shows that groups with historically poorer progression and attainment do have different levels of engagement. For these groups, engagement is a stronger predictor than demographics.
Analytics to improve retention at Huddersfield
Next up was Sarah Broxton, Strategic Planning Officer at the University of Huddersfield, who presented on Huddersfield’s work to improve retention through the use of data. Despite Huddersfield’s improving NSS scores and league table positions, there has been a strategic requirement to improve retention rates and institutional effectiveness and efficiency. Meanwhile, attendance monitoring and a centralised timetable system have been introduced, and there’s a need to inform staff better about the data available to them.
By mapping the characteristics of leavers, such as age and entry qualifications, to current cohorts, together with attendance data, Huddersfield produced reports of the students most likely to leave early, which were communicated to personal tutors and other staff, encouraging them to get in touch. As with other large IT projects, Huddersfield found the technical issues relatively easy to solve – it’s changing human practices and processes that creates the challenge. However, through increased transparency and training for colleagues, acceptance of learning analytics is increasing.
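As a rough illustration of this kind of early-leaver reporting, the sketch below fits a simple logistic regression on historical leaver characteristics and attendance, then scores a current cohort. All field names, values and the choice of model are assumptions made for illustration; Huddersfield’s actual approach wasn’t described in that level of detail.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Historical records: characteristics of past students and whether they left
# early. All field names and values are invented for illustration.
history = pd.DataFrame({
    "age_on_entry":   [18, 24, 19, 31, 18, 22],
    "tariff_points":  [120, 80, 96, 64, 144, 104],
    "attendance_pct": [92, 55, 78, 40, 95, 60],
    "left_early":     [0, 1, 0, 1, 0, 1],
})

features = ["age_on_entry", "tariff_points", "attendance_pct"]
model = LogisticRegression().fit(history[features], history["left_early"])

# Score the current cohort so that the students most likely to leave early
# can be reported to personal tutors for a supportive conversation.
current = pd.DataFrame({
    "student_id":     ["s101", "s102", "s103"],
    "age_on_entry":   [19, 26, 20],
    "tariff_points":  [112, 72, 128],
    "attendance_pct": [88, 48, 93],
})
current["risk_of_leaving"] = model.predict_proba(current[features])[:, 1]

print(current.sort_values("risk_of_leaving", ascending=False))
```

As the presentation itself noted, producing a report like this is the easy part; the harder work lies in the human processes around it – who contacts the student, and how.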
Trying a “skunk-works” approach
Roger Greenhalgh, Jisc’s Strategic IT and Organisational Leadership Advocate, then talked about a “skunk-works” approach to analytics. Roger showed how small innovation units created within organisations, but relatively free from procedures and regulation, can develop analytics more quickly than traditional IT departments.
Engagement analytics at the University of East London
Gurdish Sandhu and Simon Gill from the University of East London discussed their Information Strategy and Student Engagement Analytics, which combine attendance monitoring data with VLE usage data and other sources to indicate students at risk of dropping out. The University uses QlikView and is at the forefront of deploying useful dashboards for a range of learning analytics applications.
Jisc’s work in the area
After lunch I presented with Paul Bailey and Michael Webb on Jisc’s activities in the area, discussing the architecture for a basic learning analytics system which is currently being procured, some of the ethical and legal issues for a code of practice for learning analytics, and plans for a student app.
Paul also announced that Jisc will provide funding to the Network for three small learning analytics projects of £5k each to be run from June to September, reporting back at the end of the year. Network members will decide which proposals should be supported.
Group work
The final session of the day involved splitting into groups to discuss some of the issues of most concern to institutions, notably:
Interventions: we need to be open and transparent about these – and it should be clear when they will happen. Accurate interpretation of data is essential. The student needs proper support in order to take any suggested actions. Meanwhile, messages to students need to be managed carefully so they don’t have a negative effect. The intervention should be captured and measured so that institutions can find out what works.
Institutional adoption: in order to develop and roll out analytics at institutions, the following need to be considered:
- Identify the stakeholders
- Fit any analytics project work into the institution’s business planning cycle – use something like a “quality of learning and teaching forum”
- Ensure that senior management sponsorship is secured and that learning analytics is prioritised against competing projects
- Put mechanisms in place to interpret the analytics and define interventions
- Convince academics and tutors that there’s something in it for them
- Identify genuinely valid analytics – not just things we’d like to do
- Identify the risks of learning analytics
A more practical suggestion was for Jisc to develop a checklist for organisations on how to implement learning analytics – including the “elevator pitch” to senior management.
There was a tangible enthusiasm among those present about the potential of learning analytics to improve the student experience, and we’ll be planning further events soon. To stay informed about future events you can subscribe to the analytics@jiscmail list.