Examining open learning analytics – report from the LACE Project meeting in Amsterdam

The latest LACE Project event was held in the illustrious surroundings of the Allard Pierson Museum at Amsterdam University this week. The focus this time was on open learning analytics. After some lightning presentations on participants' interests in the area, we split into groups to look at what exactly "open" meant in the context of learning analytics. We discussed some of the issues with Adam Cooper of Cetis, shown here alongside other fine specimens of humanity.

Probably the most obvious aspect of openness in the context of learning analytics is the reuse of code (and documentation) that others have created. Predictive models can, of course, also be opened up to others. Having open APIs in learning analytics products is important too, as is interoperability between the different systems. And a vibrant community of people and institutions who wish to share and enhance all of these things is essential.
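
To make the interoperability point a little more concrete: one open specification widely used in this space is xAPI (the Experience API), in which learning activity is expressed as "actor / verb / object" statements that any conformant learning record store will accept. Here is a minimal sketch in Python – the endpoint, credentials and activity identifiers are hypothetical, not from the meeting:

```python
import requests

# An xAPI statement: the standard "actor / verb / object" triple.
# Because the format is an open specification, the same statement can be
# sent unchanged to any conformant learning record store (LRS).
statement = {
    "actor": {
        "mbox": "mailto:student@example.ac.uk",
        "name": "Example Student",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://vle.example.ac.uk/course/101/quiz/3",
        "definition": {"name": {"en-US": "Week 3 quiz"}},
    },
}

# lrs.example.ac.uk is a made-up endpoint; real LRSs also require
# authentication (typically HTTP Basic) and the xAPI version header.
response = requests.post(
    "https://lrs.example.ac.uk/xapi/statements",
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("key", "secret"),
)
response.raise_for_status()
```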

Openness was thought to "keep the market honest", and it also improves transparency for learners and other users of learning analytics systems. Openness may also mean that the data from one environment can be connected to that in another, not only across the different systems in one institution but potentially with other institutions too. Adam has documented some of the organisational initiatives to share data for learning analytics.

In a group discussion later we looked at some of the next steps or “big ideas” for open learning analytics:

  • Clarifying the technical architectures – Apereo has defined an architecture and Jisc is commissioning the components of a basic learning analytics system
  • Developing a privacy-enhanced technology infrastructure
  • Student-facing tools that allow learners to monitor and analyse their own learning progress
  • Tools to evaluate accessibility issues – an example was given of a system which determines if users are dyslexic and then adapts the learning accordingly

The other groups reported back on their proposed essential next steps:

  • Understanding the organisational consequences (or “systemic impact”) of deploying learning analytics
  • Gathering evidence for relevant specifications and standards that work or don’t work
  • Obtaining consent from students to help evolve learning analytics, instead of just agreeing that their data can be used (see Sharon Slade and Paul Prinsloo’s suggestions on students being collaborators in the learning analytics process rather than mere data subjects)
  • Building reference implementations of learning record stores (a toy sketch of the basic store-and-retrieve behaviour follows this list)
  • Understanding the barriers to the deployment of learning analytics
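
On the learning record store point above: at its core an LRS is simply a service that accepts and returns activity statements. The following is a deliberately minimal in-memory sketch using Flask – a toy illustration of the idea, not the Apereo or Jisc work, and it omits the authentication, statement validation and durable storage a real reference implementation would need:

```python
import uuid
from flask import Flask, jsonify, request

app = Flask(__name__)
statements = {}  # in-memory only; a real LRS needs durable storage

@app.route("/xapi/statements", methods=["POST"])
def store_statement():
    """Accept a single xAPI statement and store it by id."""
    stmt = request.get_json()
    stmt_id = stmt.get("id") or str(uuid.uuid4())
    stmt["id"] = stmt_id
    statements[stmt_id] = stmt
    # xAPI requires the stored statement id(s) to be returned on POST
    return jsonify([stmt_id]), 200

@app.route("/xapi/statements", methods=["GET"])
def get_statements():
    """Return all stored statements (no filtering in this toy version)."""
    return jsonify({"statements": list(statements.values())})

if __name__ == "__main__":
    app.run()
```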

One of the clearest messages to come out of the day was just how important tackling the privacy issues is becoming in order to avoid an InBloom-type fiasco in Europe. While this is a problem everywhere, concerns about the lack of systems for granting consent to the collection of data are massively holding up implementations of learning analytics in countries such as the Netherlands and Germany. A further event being planned for Paris in February will attempt to progress understanding in this area and develop transparent learning analytics systems which include mechanisms to obtain consent at granular levels from students to process their data.
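
What such granular consent might look like in data terms remains an open question. As a purely hypothetical sketch – the purpose and data-source names below do not come from the meeting – one could imagine consent recorded per purpose and per data source, rather than as a single blanket agreement:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical structure for illustration only: consent is captured
# separately for each (purpose, data source) pair, so a student can
# permit one use of their data while refusing another.
@dataclass
class ConsentRecord:
    student_id: str
    purpose: str      # e.g. "course-dashboard", "early-warning-prediction"
    data_source: str  # e.g. "vle-clickstream", "library-loans"
    granted: bool
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def may_process(consents, student_id, purpose, data_source):
    """Allow processing only where an explicit, matching grant exists."""
    return any(
        c.granted
        and c.student_id == student_id
        and c.purpose == purpose
        and c.data_source == data_source
        for c in consents
    )

consents = [
    ConsentRecord("s123", "course-dashboard", "vle-clickstream", True),
    ConsentRecord("s123", "early-warning-prediction", "library-loans", False),
]
assert may_process(consents, "s123", "course-dashboard", "vle-clickstream")
assert not may_process(consents, "s123", "early-warning-prediction", "library-loans")
```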

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.
