
Notes from the Learning Analytics Network event in Edinburgh

On a fabulous spring day last Friday, around 44 people made it to Edinburgh for the second meeting of the Learning Analytics Network, jointly organised by Edinburgh University and Jisc.

[Photo: the audience at the Edinburgh Network Event]

I suspect a number of us may have been somewhat bleary-eyed after witnessing the political landscape of the UK being redrawn during the previous night. However, the quality of the presentations and discussion throughout the day seemed to keep everyone awake. It was particularly interesting to hear about the various innovations at Edinburgh itself, which is emerging as one of the pioneering institutions in learning analytics.

Edinburgh’s visionary approach is highlighted by the recent appointment of Dragan Gašević as Chair in Learning Analytics and Informatics, and it was great to have Dragan give the first presentation of the day: Doing learning analytics in higher education: Critical issues for adoption and implementation (PDF – 1.21MB).

Dragan outlined why learning analytics is increasingly necessary in education and examined some of the landmark projects so far such as Signals at Purdue and the University of Michigan’s development of predictive models for student success in science courses.

In an Australian study he was involved with, Dragan found that universities lack a data-informed decision-making culture and that, while researchers are carrying out lots of experimentation, they are not focussed on scaling up their findings. Finally Dragan looked at ethics and mentioned the Open University’s policy and Jisc’s (soon to be released) Code of Practice for Learning Analytics.

Next up was Sheila MacNeill on Learning Analytics Implementation Issues (Presentation on Slideshare). Sheila gained expertise in learning analytics while working for Cetis and has now been attempting to put this into practice at Glasgow Caledonian University. On arriving at the institution 18 months ago she found it difficult to get to grips with all the systems and data of potential use for learning analytics. She started by identifying the key areas: assessment and feedback, e-portfolios, collaboration and content. This data is hard to get at and needs a lot of cleaning before it can be used for learning analytics.

Sheila’s summary slide outlines the main issues she’s encountered:

  • Leadership and understanding are crucial – you need both a carrot and stick approach.
  • Data is obviously important – ownership issues can be particularly problematic.
  • Practice can be a challenge – cultural issues of sharing and talking are important.
  • Specialist staff time matters – learning analytics has to be prioritised for progress to be made.
  • Institutional amnesia can be an issue – people forget what’s been done before and why.


Zipping back to the East Coast, Wilma Alexander talked about Student Data and Analytics Work at the University of Edinburgh (PDF – 866kB). She discussed attempts to use Blackboard Learn and Moodle plug-ins for learning analytics, finding that neither of them was designed to provide data to students themselves. The team then collected 92 user stories from 18 staff and 32 students. Much of what people wanted was actually already available if they knew where to look for it. Students wanted to understand how they compared with others, to track their progress, and to view timetables, submissions and assessment criteria.

The next presenter, also from Edinburgh, was Avril Dewar: Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities (PPT – 351kB). Avril discussed her work at the Centre for Medical Education to develop an early warning system to identify disengaged students; the system identified 80% of at-risk students. The metrics included engagement with routine tasks, completion of formative assessment, tutorial attendance, attendance at voluntary events, and use of the VLE.
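The talk didn’t cover how these metrics were combined, but a simple weighted-score approach gives a flavour of how such an early-warning flag might work. This is a minimal sketch only: the weights, threshold and field names below are illustrative assumptions, not the actual model used at the Centre for Medical Education.

```python
# Illustrative sketch: combine engagement metrics (each scaled to 0-1)
# into a single score and flag students below a threshold as "at risk".
# Weights and threshold are invented for this example.

WEIGHTS = {
    "routine_tasks": 0.30,         # engagement with routine tasks
    "formative_assessment": 0.25,  # completion of formative assessment
    "tutorial_attendance": 0.20,   # tutorial attendance
    "voluntary_events": 0.10,      # attendance at voluntary events
    "vle_use": 0.15,               # use of the VLE
}

AT_RISK_THRESHOLD = 0.5  # arbitrary cut-off for this sketch


def engagement_score(metrics: dict) -> float:
    """Weighted average of the metrics, each expected in the range 0-1."""
    return sum(WEIGHTS[name] * metrics.get(name, 0.0) for name in WEIGHTS)


def flag_at_risk(students: dict) -> list:
    """Return IDs of students whose engagement score falls below the threshold."""
    return [sid for sid, metrics in students.items()
            if engagement_score(metrics) < AT_RISK_THRESHOLD]


if __name__ == "__main__":
    cohort = {
        "s001": {"routine_tasks": 0.9, "formative_assessment": 1.0,
                 "tutorial_attendance": 0.8, "voluntary_events": 0.5, "vle_use": 0.7},
        "s002": {"routine_tasks": 0.2, "formative_assessment": 0.0,
                 "tutorial_attendance": 0.4, "voluntary_events": 0.0, "vle_use": 0.1},
    }
    print(flag_at_risk(cohort))  # -> ['s002']
```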

Yet another Edinburgh resident, though this one working for Cetis rather than the University, was next.  Wilbert Kraan presented on The Feedback Hub – where qualitative learning support meets learning analytics (PPT – 1.86MB). The Feedback Hub is part of Jisc’s Electronic Management of Assessment project, working with UCISA and the Heads of eLearning Forum. It aims to provide feedback beyond the current module, looking across modules and years.  Wilbert proposed that feedback related data could be a very useful input to learning analytics.

My colleagues Paul Bailey and Michael Webb (most definitely neither from Edinburgh) and I (from Edinburgh originally!) then updated attendees on progress with Jisc’s Effective Learning Analytics programme (PDF – 318kB).  In particular we described the procurement process for the basic learning analytics system (which will be the subject of further publicity and another imminent post on this blog) to be made available freely to UK universities and colleges.  We also discussed the Discovery Stage where institutions can receive consultancy to assess their readiness for learning analytics. Paul concluded by mentioning the next Network Event at Nottingham Trent University on 24th June (Booking Form).

Later we had several lightning talks, the first from Prof Blazenka Divjak of the University of Zagreb, though currently a visiting scholar at, you guessed it, the University of Edinburgh. Blazenka presented on Assessment and Learning Analytics (PPTX – 385kB). She has found the main challenge in learning analytics to be the management and cleansing of data. She discussed two projects undertaken at Zagreb. The first examined the differences in performance between groups based on gender, previous study and other factors. The second analysed the validity, reliability and consistency of peer assessment. She demonstrated visualisations which allow students to compare themselves with others.

Paula Smith from Edinburgh gave another interesting lightning presentation on The Impact of Student Dashboards. She reported on an innovation in their MSc in Surgical Sciences which expanded on existing tracking of students via an MCQ system to create a student dashboard. The dashboard allowed students to monitor their performance in relation to others, provided staff with information on at-risk students, and helped evaluate the impact of any interventions that took place as a result. Most students welcomed the dashboard and many thought they would want to view it monthly.

Finally, Daley Davis from Altis Consulting talked about what his company is doing in Learning Analytics (PDF – 663kB). Altis is an Australian company, and Daley discussed how Australian institutions are extremely focussed on retention due to the funding regime. Working with the University of New England, Altis cut attrition rates from 18% to 12%. A student “wellness engine” was developed to present data at different levels of aggregation to different audiences. The data used included input from a system which asked students to report their emotional state.

In the afternoon we split into groups to discuss the “burning issues” that had emerged for us during the day.  These were:

  • Make sure we start with questions first – don’t start with a technical framework
  • Data protection and when you should seek consent
  • When to intervene – triage
  • Is the data any use anyway?
  • Implementing analytics – institutional service versus course/personal service
  • Metrics and reliability
  • Institutional readiness / staff readiness
  • Don’t stick with deficit model – focus on improving learning not just helping failing students
  • Treating cohorts / subject disciplines / age ranges differently
  • Social media – ethics of using Facebook etc for LA
  • Don’t avoid interpreting data just because it raises an issue you don’t want to deal with
  • Using LA at either end of the lifecycle
  • Ethics a big problem – might use analytics only to recruit successful people
  • Lack of sponsorship from senior management
  • Essex found through student surveys that students do want analytics

I’m immensely grateful to Nicola Osborne for her comprehensive live blog of the event from which this summary draws heavily. I wish there was a Nicola at every event I attended!

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.

One reply on “Notes from the Learning Analytics Network event in Edinburgh”

Thanks for this summary Niall, and for your kind comments – I’m very glad my notes were useful.

Thank you again for an excellent event last week – lots of great presentations and discussions to digest!

– Nicola.
