Gathering requirements for a student app for learning analytics

What data and analytics should be presented directly to students? This was the subject of a workshop on 27th February in Jisc’s London offices, attended by staff from across UK higher education. We were also joined by a couple of students with a keen interest in the area.

In advance of the meeting, members of Jisc’s Student App Expert Group had anonymously provided a range of potential requirements, which I then grouped and used as the basis for the workshop (they’re included at the bottom of this post for information).

The first area is around information provision to students, and comprises functionality for:

  • Monitoring your academic progress
  • Comparing your progress to others or to your prior performance
  • Monitoring your engagement with learning
  • Providing useful information such as exam times and whether there are free computers available

The second area is concerned with action – the student actively entering information or doing something to enhance their learning. It consists of:

  • Prompts and suggestions
  • Communication facilities with staff and students
  • The upload of personal data
  • Providing consent to what data is used for learning analytics

Various other issues were suggested relating to the interface (e.g. ensuring it is easy to use), ethics (e.g. being aware that the app can only ever give a partial view of student progress), and data (e.g. accepting data from a wide range of sources).

During the day, groups discussed a number of these areas of functionality. For each we defined an idea, a purpose, benefits, drawbacks and risks, and presentational aspects. Some of these ideas are fairly wacky and might not survive further interrogation or prioritisation, but here they all are for the record. The next stage is to run the ideas past students themselves to find out what they want to see in an analytics app.

How engaged am I?
The most common application of learning analytics is measuring student engagement. Putting this information in the hands of the learners themselves could help to reassure those who feel they’re on track and prompt those who aren’t engaging. There’s always the risk, of course, that students will game the system to achieve better engagement ratings without actually improving their learning. However, it could also result in them finding the library, attending more lectures, using the VLE more or reading more books.

An idea for presenting this information was to show overall engagement on a scale of 1 to 5. Clicking the indicator would reveal a further breakdown, e.g. into library usage, lecture attendance and VLE usage. VLE usage might be broken down further if required, showing forum participation perhaps if that was considered important. Data could be shown by module as well as across modules.
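One way such a drill-down might be modelled is sketched below. The component names, weights and rounding to the 1-to-5 scale are purely illustrative assumptions for this sketch, not part of any Jisc specification.

```python
# Illustrative only: the components, their weights and the 1-5 banding are
# assumptions for this sketch, not part of any Jisc specification.
COMPONENT_WEIGHTS = {"library": 0.3, "attendance": 0.4, "vle": 0.3}

def overall_engagement(components):
    """Combine per-component scores (each on a 1-5 scale) into a single
    weighted overall score, rounded to the nearest band on the same scale."""
    score = sum(COMPONENT_WEIGHTS[name] * value
                for name, value in components.items())
    return round(score)

# Clicking the overall indicator would then reveal the components themselves:
breakdown = {"library": 2, "attendance": 5, "vle": 4}
print(overall_engagement(breakdown))  # → 4
```

In practice each institution would presumably tune the weights and add or remove components, and the same structure could be computed per module as well as across modules.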

Compare my engagement
Learners’ engagement could be compared with that of their peers or even with their own past performance. Again, this could potentially be motivating and inspire students to change their behaviour. The risks include being demotivating, falsely equating engagement with success, and privacy issues, e.g. the identification of individuals in small cohorts from anonymised data.
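A common mitigation for the small-cohort problem is to suppress the peer comparison whenever too few students contribute to it. The sketch below assumes an arbitrary minimum cohort size; the real threshold would be a policy decision.

```python
MIN_COHORT_SIZE = 5  # assumed threshold; the real value is a policy decision

def cohort_average(scores):
    """Return the anonymised cohort average, or None when the cohort is too
    small for an average to be shown without risking re-identification."""
    if len(scores) < MIN_COHORT_SIZE:
        return None  # suppress: an average over very few students can leak
    return sum(scores) / len(scores)

print(cohort_average([3, 4, 5]))        # suppressed: None
print(cohort_average([3, 4, 5, 2, 4]))  # → 3.6
```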

How am I progressing?
The aim here is to gather and surface academic progress indicators and to identify actionable insights for the student. Timely information would aim to change their behaviour and improve achievement. Having all the information in one place would be beneficial, but would there be enough information to enable students to take action? One risk is that this could “kill passion” for the subject and further divert effort into assessed or measured activities. Providing context would also be important – a grade without feedback may not be helpful. It could also be counterproductive for high-performing students. Meanwhile, raised and unfulfilled expectations could result in worse feedback for institutions on the National Student Survey.

Data could be presented on a sliding scale, showing students whether they were likely to pass or fail and allowing them to drill down into more granular detail on their academic performance.

Compare my academic progress
This functionality would allow students to compare where they were in key activities with previous cohorts and with peers. It could aid those who lack confidence and help them to realise that they are doing better than they realised. Of course it could also damage their confidence. Another risk is that the previous cohort might be different from the current intake or the way the course is being taught might have changed.

My assessments
A possibility would be to show analytics on what successful students do and how your actions compare, e.g. if students submit assessments just before the deadline, are they more likely to fail? This might result in students being better prepared for their assessments.

My career aspirations
The aim here would be to help understand whether the student is on track to achieve their chosen career, based on records of previous learners. This might include networking opportunities with students who have already followed a similar path. It might help to increase engagement and assist with module pathway planning. Students could talk about their skills and better understand how to quantify them.

Meanwhile suggestions such as “you need to know about teamwork” or “identify opportunities for voluntary work” could be provided. The app might also suggest alternative career paths or that a student is on the wrong one e.g. “your maths does not appear to be at the level required for a nuclear physicist”.

Risks include that the app could be overly deterministic, restricting learner autonomy – and that students would need to ensure that their data was up to date.

Plan my career path
A related possibility is showing what educational journey a student needs to take to achieve their intended career, helping them to avoid the wrong choices for them e.g. what does the life of a midwife look like and what was their educational journey to get there?

My competencies
Another idea discussed was to enable students to monitor their competencies and reflect on their skills development, perhaps through some sort of game. This could encourage them to engage better with the materials and with their cohort. Again this wouldn’t of course guarantee success.

My emotional state
Enabling students to give an idea of their emotional state in some way would allow them to gauge how they compared to their peer group, and to provide better feedback to the institution or to tutors. This is highly personal information, of course, and students might want it to be visible only to themselves unless it is anonymised.

Why I didn’t attend
The app could allow students to input their reasons for non-attendance, e.g. “I didn’t attend this lecture because I had my tonsils out” and “but while recovering in hospital I watched the lecture video and read the lecture notes”. This might enable the adaptation of engagement scores so that students felt they reflected the real situation.
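As a rough illustration of how a verified reason for absence might feed back into an engagement score, the hypothetical calculation below removes excused sessions from the attendance denominator. This is one possible adjustment among many, not a design the workshop settled on.

```python
def adjusted_attendance(attended, scheduled, excused):
    """Attendance rate with verified, self-reported absences removed from
    the denominator, so the score better reflects the real situation."""
    countable = scheduled - excused
    if countable <= 0:
        return None  # every session was excused: nothing left to measure
    return attended / countable

# 8 of 10 lectures attended, with the 2 missed lectures excused (tonsils
# out, but the lecture video was watched while recovering in hospital):
print(adjusted_attendance(8, 10, 2))  # → 1.0
```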

Communication facilities
We looked at whether the app should include communications facilities around the analytics. This might be between students and tutors, or perhaps with peer mentors. There was concern that this might be mission creep for the app; however, integrating communications around the interventions taken on the basis of the analytics might be useful. The app could also provide information about opportunities for communications around student support, with personal tutors, study buddies, peer mentors or clubs.

There would be potential for communications based on the programme rather than just the module, and the functionality might for instance be used to facilitate the take-up of personal tuition. The tools available might depend on the level of the students, e.g. encouraging those on a one-year taught Masters. One issue raised was that students would expect a quick response, and this might result in even more email “tyranny” for academics.

Link app to my social media accounts
The idea here is to enable students to link the app to Twitter, LinkedIn or other social media accounts so that they can send status updates from the student app. This would enable the aggregation, for example, of Twitter feeds from all those on the module with Twitter accounts, allowing learners to connect better with each other. The institution could use the data for sentiment mining, and updates could be fed to the lecturer, even while they’re giving the lecture.

Give my consent for learning analytics
In order to ensure the legal and ethical collection and use of student data for learning analytics, a key part of the learning analytics architecture Jisc is commissioning will be a consent system, which is likely to be controlled from the student app. This could be particularly important in some of the more personal applications such as linking to your social media accounts or inputting your emotional state. It will also help users to understand what is being done with their data, feel a sense of control over the process, and reduce concerns that data could be misused. It would allow students to control any third-party access to their data, e.g. by future employers.
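A minimal sketch of what such a consent record might look like, assuming a simple per-purpose opt-in model. The purpose names are examples drawn from this post, not the actual Jisc consent taxonomy, and everything defaults to "no consent".

```python
from dataclasses import dataclass, field

# Hypothetical purposes, drawn from examples in this post; each defaults
# to refusal until the student explicitly opts in.
DEFAULT_PURPOSES = {
    "engagement_analytics": False,
    "social_media_linking": False,
    "emotional_state": False,
    "third_party_access": False,  # e.g. future employers
}

@dataclass
class ConsentRecord:
    student_id: str
    purposes: dict = field(default_factory=lambda: dict(DEFAULT_PURPOSES))

    def grant(self, purpose):
        self.purposes[purpose] = True

    def revoke(self, purpose):
        self.purposes[purpose] = False

    def allows(self, purpose):
        # Unknown purposes default to refusal rather than permission.
        return self.purposes.get(purpose, False)

record = ConsentRecord("s1234567")
record.grant("engagement_analytics")
print(record.allows("engagement_analytics"))  # → True
print(record.allows("third_party_access"))    # → False
```

Defaulting every purpose to refusal keeps the design opt-in, which matches the post's emphasis on students feeling a sense of control over the process.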

My location
Providing geolocation data to the app could have a number of applications such as helping vulnerable students to feel safer, campus mapping and self-monitoring. It could help institutions by enabling the tracking of the use of services. Students might also be prompted to attend campus more or spend more time in the library. This does of course have privacy implications and access to location data would need to be strictly controlled (by the student). It would also generate large quantities of data.

Fun analytics
The aim here would be to motivate and engage, and to get students to use the app, by providing fun or amusing analytics. Options discussed included “calorie burner info” e.g. “you read 2 articles today and used 5 calories”; a campus induction game; weekly challenges based on activity and studies; and a badge system of rewards.

Where next?
A recommendations engine could be presented through the app, providing relevant offers, signposting and information to students. Again this could potentially result in increased engagement, driving students to helpful services. On the downside it could be intrusive, add to information overload, and be used for marketing rather than benefitting learning.

Information could be presented on what’s trending, forthcoming local events, and silly facts, e.g. “30% of students who eat here get a 1st class honours!” This could help students to be better informed and prompt them to do something they might not have done before.

My students’ union
Increased engagement with the students’ union can help learners to feel better connected, so the app could also be used to facilitate this by showing events and information – and potentially engage them more in the democratic process.

Car park
We parked a number of ideas during the day to return to perhaps at a later stage, including: assessment regulations, tutor performance, data literacy, the naming of the app, and how we get disengaged students to use it.

Suggested functionality for the app
The following possible functions were suggested by members of the Student App Expert Group in advance of the session and then expanded on in the discussions, as summarised above. This provides a good checklist of what we might wish to consider including:

Monitoring academic progress

  1. Progress. What percentage of the course materials, activities, formative assessments etc. have you done?
  2. Student should be able to see their progress with clear indication whether they are at risk or not
  3. Show students their academic progress, at a granular level: what marks they have for each assignment and how that contributes to their overall progress
  4. Ability to track own academic progress – get marks, compare own marks across modules and years
  5. Monitor student progress (provide overall picture of student performance and alert to potential problems)
  6. Could there be an area showing their student performance?
  7. Real-time, or near real-time updates on progress
  8. At a glance views of progress against criteria (such as assessment), links to personal progression tracking, and ‘useful’ traffic-light style
  9. Overview of essay marks, including marks for research skills, writing skills, originality etc. -> ability to compare to previous essay marks
  10. Access to formative and summative marks, and feedback
  11. Performance data: grades
  12. Performance data against entry profile and previous performance
  13. An integrated view of a student’s study career, from the programme level to the course/module level
  14. What does the rating mean?

Comparing academic progress

  1. Academic “performance” in relation to others in the cohort, possibly to previous cohorts, and grade prediction
  2. Crucially, should be able to compare their data both to themselves over weeks/months/years of study, but also to the ‘average’ behaviour of the cohort with whom they study.
  3. Answering the question: “Am I in line with my cohort, both now and preferably historically too?”
  4. Comparison. How is your progress compared to others – in the class, best in class, last year’s class etc?
  5. Leaderboards? Actually I hate them but my research shows that for some classes of student they do encourage engagement.
  6. Benchmarking the student academic performance with peers
  7. Ability to compare essay marks to average marks of cohort
  8. Where would 1st class degree attainment be on the line – and 2nd class, 3rd class and so on?

Monitoring engagement

  1. Look at interactions/activity they have taken part in on VLE and/or other systems, number of journals accessed online/in the library
  2. Activity data on attendance, VLE usage and library…and if there are appropriate comparators then that

Useful information

  1. A ‘calendar plus’ function that tells you not just what your lectures are for the day, but what other activities there are around campus – sports classes, clubs, if certain lecturers have office hours, if there are free computers in the computer lab, etc. Needs to both respond to where you are on campus, as well as make suggestions based on how much time you have to spare as well as where you are at the moment. For example, ‘You have an hour until your next lecture – why not boost your ‘library score’ and visit there for a little while, or go talk to Professor Blogs since she has office hours’.
  2. Have information on the university’s important events and useful revision techniques
  3. Easy and better access to learning resources

Prompts and suggestions

  1. Student should know what to do next
  2. Provide a visual representation of the chosen metric at a granular enough level that student activity can clearly precipitate change
  3. Potential outcomes and necessary effort – predict 2:2 do this well here and here and get a 2:2
  4. Recommendations of training courses and resources based on essay marks
  5. Prompts to regular self-assessment of research skills, writing skills, presentation skills etc. -> allow students to take responsibility for learning/progress
  6. Provide students with a ‘to do’ list, showing what they have to do and by when. The difficulty here is making it all inclusive
  7. Right now immediately after reading this text, what do you expect students to actually do
  8. Give students access to people who can help them and identify the specific kinds of help that can be provided
  9. Tips to improve performance, what to do next
  10. Gives information on how to improve not just/only status
  11. Diagnostics. The system should be able to see where I’m not doing well and point me to support materials. E.g. you don’t seem to be doing well at this bit of the syllabus – or you don’t seem to be doing well at more analytic questions…
  12. A recommendations aspect based on past use (and how others behave) – based on this module/this paper/this time of studying, we recommend that you consider this topic/this other article/this prime study time
  13. Have information on ways they could improve their student engagement

Communication facilities

  1. A ‘question’ function to send concerns to the academic personal tutor or other intermediary
  2. Identify effective communication strategies
  3. Facilitating interactive and better communications with academic and admin staff
  4. Ability to communicate between staff-student, student-student

Upload of personal data

  1. Ability to load personally captured data to provide context and information
  2. Allow students set their own notifications – which may be alerts, reminders, or motivating messages – triggered by circumstances of their choosing. (Making good decisions about this would need facilitation, but would help towards metacognitive awareness and understanding of the data and the software themselves).

Providing consent

  1. A way of opting in or out of sharing the data, or aspects of the data, with staff
  2. Granular control of who sees what – controlled by the student

Other issues

  1. The student app should be easy to use
  2. Easy access to visual information
  3. Provide a visual representation of the chosen metric at a granular enough level that student activity can clearly precipitate change
  4. Whatever the outcome is for the learning analytics app, I’d try and keep the core interface simple. I’d personally prefer one graphic ultimately, but I’m sure there are arguments for a range of options
  5. Cross-platform and brandable (the students may know their institutional brand but perhaps won’t respond to something plastered in Jisc branding)
  6. Access to the underlying data, but also good conceptually-straightforward visualisation of that data.
  7. Analytics visualisations that will prove compelling for students to visit the app.
  8. Cross platform /device so all can access

Ethics

  1. Transparency about the gaps – the app should avoid over-determining – or giving the impression of over-determining – students’ progress and achievement based on data, which is inevitably an incomplete representation of learning but which may carry more weight than the ineffable or unrecorded moments of learning.

Data sources

  1. Accept data from a range of simple or aggregated end-points – I appreciate it is likely to accept a feed from the Jisc basic LA tool, but it would be useful if we could provide a feed from the basic data we have in Blackboard ASR

Impact on teaching

  1. Identify effective teaching and assessment practices

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.

One reply on “Gathering requirements for a student app for learning analytics”

This is a really thorough airing of issues and opportunities. Thank you. I loved the idea of the fun analytics, which could be a real winner. I am concerned, however, about the emotional states mapping and ‘sentiment mining’. I do agree that a trial of volunteers looking at the impact of emotional states on engagement and learning outcomes would be fabulously interesting. Extending this past a group of volunteers for research purposes enters the realm of unacceptable obtrusion into a personal and intimate space. For example, learners who see a connection between states of anxiety or sadness and lower engagement, but have anxious personalities are not going to be supported by this element. What might be supportive, if carefully handled, is evidence-based findings on key attributes/states of mind for learning engagement, with advice on healthy living and how to raise mood etc. But this is not in the domain of the student app – it is more general advice to be given through a different route.
