
My Algorithmic “Friend” – by Andrew Cormack

In a workshop at last week’s AMOSSHE conference, we discussed how wellbeing analytics might be able to assist existing Student Support services.

Student support is simplest when an individual themselves asks for help: a support service can immediately begin to discuss – using toolkits such as that developed by UHI and AMOSSHE – what the problem is and how the university or college can help. Sometimes a friend will report concerns: in this case the support service needs first to work out how to contact the individual and find out if they do, indeed, need help. This must be done in ways that minimise risks to privacy, wellbeing and trust (in both the organisation and the friend).

It has been suggested that algorithms like those used for learning analytics might be able to act as “friends” for everyone in the university: raising alerts when data suggest there may be a wellbeing issue. This amplifies the challenges of human friend reporting – not least because we can’t discuss concerns with an algorithm – and expands the risks, as well as the potential benefits, to everyone, not just students with concerned friends.
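To make the idea concrete, here is a purely hypothetical sketch of what such an alert rule could look like. Everything in it – the `Student` record, the `engagement_history` data, the `wellbeing_alert` function and the `drop_threshold` value – is invented for illustration and is not part of any Jisc system or the draft code of practice:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Student:
    name: str
    engagement_history: list[float]  # e.g. weekly VLE activity scores (invented data)

def wellbeing_alert(student: Student, drop_threshold: float = 0.5) -> bool:
    """Flag a student whose recent engagement has fallen sharply below their
    own earlier baseline. Entirely illustrative: a real system would need
    clinically overseen criteria, contextual information and human review."""
    history = student.engagement_history
    if len(history) < 4:
        return False  # too little data to say anything
    baseline = mean(history[:-2])
    recent = mean(history[-2:])
    return baseline > 0 and recent < drop_threshold * baseline

# An alert is only a prompt for a trained human to follow up, never a diagnosis.
student = Student("example", [10, 9, 11, 10, 3, 2])
if wellbeing_alert(student):
    print("Flag for review by student support / health professional")
```

Even this toy rule shows where the risks come from: the algorithm sees only the data it is given, knows nothing about why engagement dropped, and will inevitably flag some students who are fine and miss some who are not.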

The Jisc Learning Analytics Code of Practice and legal model seem to provide a good basis for this kind of Wellbeing Analytics, but both need to be adapted to deal with health (in legal terms “Special Category”) data. The rules derived from Legitimate Interests for “Analysis of Learning” need to be supplemented for “Analysis of Wellbeing” with those for Preventive Medicine in EU law, and for Public Interest/Confidential Counselling under the UK Data Protection Act 2018. The attached draft annex to the Learning Analytics Code of Practice includes these additional requirements.

Probably the most important point is that policies, data, systems, algorithms and processes need to be overseen by health professionals, though they can be operated by tutors and others. As a recent Guardian article observes, these processes look a lot like medical diagnosis, which is a regulated activity.

Data Protection law, too, is likely to consider wellbeing analytics a high-risk activity, requiring a formal Data Protection Impact Assessment to identify the risks to individuals and ensure they can be managed. Prior consultation with the Information Commissioner may also be needed. Wellbeing Analytics will require even greater care than Learning Analytics in describing, protecting and reviewing the data, processing and results.

Finally, thinking of algorithms as a “friend” highlights some particular concerns:

  • Beware of anything that could feel like “surveillance”: no one asks Big Brother for help;
  • Keep testing and refining the data and algorithms, just as you test and refine your guidance to human friends: remember that the algorithm will never be “right” (see the sketch after this list);
  • Start from a robust process for human friends: strengthen it to cope with algorithmic “friends” who know everything about data, little about context and nothing about empathy.
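As a hedged illustration of the “keep testing and refining” point above: if a record of past alerts and outcomes were available, the alert rule could be reviewed periodically to see how often it flagged students who did need support and how often it missed them. The `evaluate_alerts` function and the example data below are invented; in practice outcome information would have to come from support services under appropriate confidentiality and professional oversight:

```python
def evaluate_alerts(flagged: set[str], needed_support: set[str]) -> dict[str, float]:
    """Compare who the algorithm flagged against who actually needed support.
    Invented example; real outcome data would be handled under health-data rules."""
    true_positives = flagged & needed_support
    precision = len(true_positives) / len(flagged) if flagged else 0.0
    recall = len(true_positives) / len(needed_support) if needed_support else 0.0
    return {"precision": precision, "recall": recall}

# Hypothetical review: the algorithm flagged A, B and C; support was needed by B, C and D.
print(evaluate_alerts({"A", "B", "C"}, {"B", "C", "D"}))
# -> precision and recall of about 0.67 each: never “right”, always worth refining
```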

If you have comments on the draft Wellbeing Analytics Code of Practice, please send them to <andrew.cormack@jisc.ac.uk>.

