A wide-ranging discussion of the emerging Code of Practice for Learning Analytics took place in London last week. A new advisory group for the Code includes representatives from the National Union of Students, Edinburgh, Huddersfield, Lancaster and Loughborough Universities, Bucks New University, The Open University, Croydon College and Jisc.
A Code of Practice for Learning Analytics has been identified by Jisc stakeholders in higher and further education as a development priority. The literature review is the first stage in this activity and provides the underlying rationale. The next stage is developing the document itself.
The Code aims to help remove barriers to the adoption of learning analytics, and we can ensure that its emphasis is positive, realistic and facilitative. It will give institutions a focus for dealing with the many legal and ethical hurdles which are arising, and it can be presented as an evolving, dynamic site rather than a lengthy one-off document which hardly anyone reads, let alone adheres to.
Jisc will coordinate the development and roll-out of the Code. Meanwhile, advisory group members agreed to critique the Code as it is being developed and to consider piloting it at their own institutions.
Methodology and approaches
Some documents take a particular methodological or philosophical stance. For instance, Slade and Prinsloo’s socio-critical approach – where learning analytics is viewed as a “transparent moral practice” and students are seen as co-contributors – has influenced the Open University’s Policy on Ethical Use of Student Data. Should the Code take such an approach?
One of the main challenges will be to strike a balance between taking a paternalistic approach and respecting students’ privacy and autonomy. It was suggested that the various uses for student data might require different approaches to consent:
- Helping individual students based on their own data
- Merging individuals’ data with those of others to help the group
- Using data to help future cohorts of students
Informed consent could potentially be obtained for each of these options.
Concern was also expressed that any sharing of student data outside the institution should be carefully controlled. The Code itself should have clear boundaries and may need to reference other institutional policies. There should be differentiation between demographic and behavioural data, and the “right to be forgotten” needs to be addressed.
A separate document for students?
An approach which puts the needs of learners at the heart of the Code is likely to result in a better and more widely adopted document, one which helps to allay the fears of students and institutions and facilitates the uptake of learning analytics. The inclusion of the NUS in this group is therefore particularly welcome.
However, a balance will need to be struck, and a series of compromises made, in order to develop a usable Code and encourage mutual understanding. The group decided that a single document setting out clearly the rights and responsibilities of students, institutions and staff would be preferable to a separate student “bill of rights for learning analytics”.
Explaining what the Code means in practice, however, may require separate advice for different stakeholders. At each institution the Code should link closely with the student charter and involve buy-in from the students’ union.
Striking a balance between high level principles and detailed guidance
Can the Code be sufficiently high level to meet the needs of all institutions while remaining specific enough to provide genuinely helpful guidance? It was very clear from my institutional visits that the potential uses of learning analytics and the concerns raised varied widely across institutions. The group thought that the document should be fairly high level in order to prove useful to all, but should be backed up by case studies and examples of how institutions have dealt with particular issues. The case studies could be released alongside the Code – for each principle there could be examples of good practice.
Conformance with the Code
Another question I posed to the group was whether we should encourage institutions to adopt the Code wholesale, and therefore be able to claim conformance with it, or to customise it to their own requirements. We probably need to see the end result first, but it was felt that institutions might want to be able to adopt the Code with local modifications.
Human intermediation
Particular concern was expressed that the Code needs to reflect the human context and the need for staff to intermediate learning analytics. This is a common ethical theme in the literature. However, a representative from the Open University said that the sheer scale of that institution makes human intermediation unfeasible for many of the potential uses of learning analytics. Meanwhile, there was real concern among members that the language used to present analytics to students should be carefully considered, and that data should only be exposed when institutions have mechanisms in place to deal with its effect on students. The potential impact of analytics on the educator also needs to be reflected in the Code.
Format
All of the related codes of practice I’ve looked at are textual documents – normally provided as PDFs. Members felt that a document outlining the principles is still needed in order to present the Code to institutional committees, but that an interactive website containing case studies, perhaps in the form of videoed interviews with staff and students, would be welcome.
Some codes are extremely lengthy and somewhat uninspiring papers stretching to thirty pages or more. One of the better formats I’ve seen is the Respect Code of Practice for Socio-Economic Research. It’s concise – only four pages – and reasonably visually appealing, and therefore arguably more likely to be read and absorbed by busy people than some of the longer codes. However, given the large number of issues identified in our literature review, four pages is unlikely to be sufficient.
One approach would be to back up a concise summary document with more detailed online guidance for each of the areas. Discussion forums could be included on each topic, enabling users to raise further issues as they arise and others to provide advice on how they’ve tackled that challenge. This would need some ongoing promotion, facilitation and moderation by Jisc and/or members of the community.
Areas to be included
The literature review covers most of the ethical and legal issues which are likely to be of concern to students and to institutions when deploying learning analytics, though there may be some which I’ve missed or which have not yet cropped up in the literature. The section headings and the word clouds in the review could help prioritise the main areas to be included in the Code. It was pointed out that it would be difficult to deal with all of these meaningfully within four pages, but certainly each area could be expanded on in the supporting documentation.
Including vendors
One member suggested including vendors in the consultation process for the Code. It might help them when making development decisions, for instance by encouraging them to build consent systems into their products. The Code could help to ensure that safeguards, such as privacy protections, are built in without holding back innovation.
Development process
Jisc will develop the Code up until May 2015 with guidance from the advisory group. Supporting content, e.g. videoed interviews, can be developed subsequently to help raise awareness of the Code, provide examples of how it’s being implemented and keep it current.
A sense of ownership by institutions and by students is essential to ensure adoption. How can this best be achieved? A range of stakeholder organisations was proposed and a number of possible events to piggy-back on were suggested. Several members said they’d be keen to try piloting the Code at their institutions too. An experiential learning cycle was suggested, with institutions thinking about:
- What’s the ethical/legal issue?
- What’s the principle to deal with it?
- How did we apply the principle?
Roll-out and dissemination
There is already considerable awareness of the intended Code of Practice, but how should it best be disseminated once developed? One member suggested it would be useful to understand better the processes institutions use to get academic policies adopted, as this will be key to uptake. In addition, a couple of events specifically around the Code could be held, papers delivered at relevant conferences, and approaches made to newspapers to see if they’d like to cover its launch. It was felt that the Code should be launched with some fanfare at a larger event to increase awareness and potential take-up.
Now on with developing it… Comments are welcome.