Notes from UK Learning Analytics Network event in Bradford

University of Bradford

Around 90 people signed up for our latest event, which took place last week at the University of Bradford. Trevor Meek and his colleagues provided the venue, and we spent most of the day examining aspects of Jisc's architecture for learning analytics and looking at ethical, legal and adoption issues for institutions.

Michael Webb started off the day with a demonstration of the learning analytics solution Jisc is putting into place for UK further and higher education (see earlier blog post). He showed how data is transferred from a VLE (in this case a Moodle instance hosted by Jisc) to the learning analytics warehouse (Learning Locker) and can then be picked up for processing by one of two solutions: the Student Success Plan system from Unicon and Marist College, or a system developed by Tribal.
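
Behind that demonstration sits the xAPI standard: the VLE emits activity statements, and Learning Locker stores them as a Learning Record Store (LRS). Purely as a sketch of that flow – the endpoint URL, credentials and activity details below are hypothetical, not Jisc's actual configuration – here is how a single xAPI statement could be posted to an LRS in Python:

```python
import requests

# Hypothetical endpoint and credentials -- substitute your own LRS details
LRS_ENDPOINT = "https://lrs.example.ac.uk/data/xAPI/statements"
LRS_KEY = "key"
LRS_SECRET = "secret"

# A minimal xAPI statement: a student experienced a course page in the VLE
statement = {
    "actor": {"objectType": "Agent",
              "mbox": "mailto:student@example.ac.uk",
              "name": "Example Student"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced",
             "display": {"en-GB": "experienced"}},
    "object": {"objectType": "Activity",
               "id": "https://vle.example.ac.uk/course/view.php?id=42",
               "definition": {"name": {"en-GB": "Example module home page"}}},
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=(LRS_KEY, LRS_SECRET),  # Learning Locker supports HTTP Basic auth
    headers={"X-Experience-API-Version": "1.0.3"},  # required by the xAPI spec
)
response.raise_for_status()
# The LRS replies with a JSON array containing the stored statement's id
print("Stored statement:", response.json()[0])
```

The processing systems Michael showed then pick up statements like these from the LRS for analysis.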

Steven Rowatt of UCL has written a good blog post which covers Michael's presentation in more depth.

One of the issues we wanted feedback on was which indicators of engagement people at the meeting thought were the most appropriate.  Participants split into groups and came up with the following:

  1. Attendance
  2. VLE engagement (needing flexibility to adapt to different requirements for use in different modules)
  3. Pre-entry data – to measure different ways into the institution e.g. Was this institution their first choice? Did they come through clearing?
  4. How students are interacting with others in the learning process e.g. participating in collaborative activities through forums and blogs

None of these is a great surprise – they already form the basis of a lot of learning analytics implementations. A new one, though, which we happen to be building into the Student App, was:

  1. How well students are doing against their own goals
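
To make these indicators concrete, here is a minimal sketch of how they might be combined into a single engagement score. The field names, normalisation and weights are all invented for illustration; none of this comes from a Jisc tool, and in practice the weighting would need the per-module flexibility mentioned above:

```python
from dataclasses import dataclass

@dataclass
class StudentIndicators:
    """Per-student indicators, each normalised to the range 0..1."""
    attendance: float        # proportion of timetabled sessions attended
    vle_engagement: float    # VLE activity relative to the module cohort
    collaboration: float     # forum/blog participation with other students
    goal_progress: float     # progress against the student's own goals

# Illustrative weights -- in practice these would be tuned per module
WEIGHTS = {
    "attendance": 0.35,
    "vle_engagement": 0.30,
    "collaboration": 0.15,
    "goal_progress": 0.20,
}

def engagement_score(s: StudentIndicators) -> float:
    """Weighted combination of the indicators, returned on a 0..1 scale."""
    return sum(getattr(s, name) * weight for name, weight in WEIGHTS.items())

# Example: good attendance, little forum activity, patchy goal progress
print(round(engagement_score(StudentIndicators(
    attendance=0.9, vle_engagement=0.7, collaboration=0.2, goal_progress=0.5)), 2))
```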

There were also some interesting factors that people thought could influence student success, which I'm not aware of anyone in the UK using yet for learning analytics. Some of these, in my view, would require consent from students:

  1. Are they working? Do they vote in union elections? How many contact hours do they have? Are they commuting? Do they have health issues? Do they have caring responsibilities?
  2. Social engagement and the take-up of extra-curricular opportunities; membership of societies

Michael Webb
Michael ponders the complexities of learning analytics

I suggested that we need to throw all the data we have at the predictive engines and find out what they tell us are the most important factors. Chris Ballard from Tribal qualified this by pointing out that you can't identify the best indicators solely through an automated process: you also need people with domain expertise to propose candidate indicators, which the algorithm then prioritises. So it should be a combination of human expertise and automated processing.
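
To illustrate the kind of combination Chris described – and this is just a sketch with invented data and indicator names, not Tribal's actual model – domain experts propose candidate indicators, and an automated step such as a random forest then ranks them by predictive importance for the experts to review:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Invented data: 500 students, candidate indicators proposed by domain experts
features = ["attendance", "vle_logins", "forum_posts", "commute_time", "contact_hours"]
X = rng.random((500, len(features)))
# Synthetic outcome: success loosely driven by attendance and VLE activity
y = (0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.2 * rng.random(500) > 0.6).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank the expert-proposed indicators by what the model found predictive
for name, importance in sorted(zip(features, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name:15s} {importance:.3f}")
```

The rankings go back to the domain experts, who decide which indicators are genuinely meaningful rather than statistical noise.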

Trevor was next, and discussed change management in his presentation: Learning Analytics from vision to implementation (PPT – 5.11Mb). Trevor has encountered a lot of confusion between learning analytics and business analytics, but he sees them as quite distinct areas. He suggested that the first questions a senior manager would ask are: what benefit does learning analytics bring to students, and how does it affect the strategic business of the institution?

He got everyone brainstorming their top strategic challenges in implementing learning analytics. He suggested some of the main ones are acceptance, variability, belief, buy-in, costs (and benefits), security, legality and longevity. He then got us thinking about how to overcome these issues before we looked at more in-depth operational challenges.

Trevor Meek
Trevor illuminating us on strategic issues

One of Trevor’s main arguments was that we need to get cracking with learning analytics. We have to start somewhere and not try to solve all the issues beforehand or we’ll never get there.

After lunch Paul Bailey and Michael discussed the Discovery Process (see earlier blog post) we’ve put in place to help institutions assess their readiness for learning analytics. Two options are available:  Unicon/Marist or Blackboard. Both provide three days of events and interviews at the institution.

We've got eight institutions on the Discovery stage at the moment and hope to have around six of them completed by Christmas. We'll then share some anonymised findings with the sector. Around 50 institutions have submitted an expression of interest to join the programme.

As an aside, I’ve just accompanied Blackboard on one of their institutional visits and thought they did a great job of getting stakeholders from across the institution talking about the key things they need to put in place for LA. This is the consultancy area of their business rather than the sales side, and they were genuinely vendor-neutral too.

I was up next and spoke about the Code of Practice and some of the issues around implementing it (PPT – 3.09Mb).  I posed the following questions for discussion in smaller groups:

  • Who is responsible for learning analytics in your institution?
  • Should students be asked for consent to use their data for learning analytics?
  • Does/should someone in your institution understand the algorithms and metrics?
  • Can students access all the data held about them in your institution?
  • Should interventions always be human-mediated?

Conversations were animated and it was hard to get people to stop talking in their groups after each question had been discussed for a few minutes – always a good sign. I’m currently putting together some back-up materials for the Code so stay tuned for updates on that if you’re struggling with some of these issues at your institution.

Dean Machin from Trilateral Research & Consulting followed me with a useful session on privacy impact assessments for learning analytics (PPT – 2.3Mb). This was a really interesting discussion of practical ways institutions can check that they have covered all the privacy aspects. Dean suggested that the key questions to ask are:

  • What’s the data collected for?
  • Is it required for the purpose?
  • Will it be used for other purposes?
  • Are the relevant stakeholders aware of these other purposes?
  • Who will have access to the data?
  • Do students know/accept how the data will be used?
  • To whom will the data be disclosed?
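
Purely as an illustration – this is not a tool Dean presented, and the field names are invented – these questions could be recorded as a structured checklist, one record per data source, so that outstanding privacy issues are flagged before the data is used:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyAssessment:
    """One privacy impact record per data source, mirroring the questions above."""
    data_source: str
    purpose: str                             # what is the data collected for?
    required_for_purpose: bool               # is it required for that purpose?
    other_purposes: list = field(default_factory=list)   # other intended uses
    stakeholders_informed: bool = False      # stakeholders aware of those uses?
    access_roles: list = field(default_factory=list)     # who will have access?
    student_consent_recorded: bool = False   # do students know/accept the use?
    disclosed_to: list = field(default_factory=list)     # external disclosures

    def outstanding_issues(self) -> list:
        """Flag questions that still need answering before the data is used."""
        issues = []
        if not self.required_for_purpose:
            issues.append("data not required for stated purpose")
        if self.other_purposes and not self.stakeholders_informed:
            issues.append("stakeholders unaware of secondary purposes")
        if not self.student_consent_recorded:
            issues.append("student awareness/consent not recorded")
        return issues

vle_logs = PrivacyAssessment(
    data_source="VLE activity logs",
    purpose="identify students at risk of withdrawal",
    required_for_purpose=True,
    access_roles=["personal tutor", "analytics team"],
)
print(vle_logs.outstanding_issues())
```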

Finally, Paul presented the latest developments around our Student App (PPT – 15.3Mb – also see earlier blog post). The app is currently being developed and will be available for trials in April.

Our next network event is likely to be in London in January or February, so we'll be able to update you on the latest Jisc developments in learning analytics then, as well as having a selection of outside speakers discussing interesting developments elsewhere.

Students having lunch at the University of Bradford
Atrium, University of Bradford

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.
