LA Cymru curriculum analytics workshop, 28th November 2018.
Most learning analytics projects focus on improving student success, but the rich data sources increasingly available to us can also be used to gain insight into the effectiveness of the curriculum.
This is a relatively unexploited area which has the potential to significantly enhance our understanding of how to create engaging and effective curricula – to find out what is and isn’t working for students.
Curriculum analytics, as we’re calling it, is one area being investigated within the LA Cymru project. To take this forward, Jisc organised a workshop at Cardiff and Vale College at the end of November, with participants from Welsh universities and other institutions in the UK.
In the morning we asked participants what kinds of things they would want to look at in relation to enhancing the curriculum. A long list of use cases was assembled, followed by a fairly involved process of grouping and categorising them.
We ended up with a number of categories and two overall groupings: those relating to the achievement of institutional goals and those related more specifically to learning and teaching.
Use cases relating to institutional goals included:
Improving NSS (Actor: L&T professional)
Improving NSS scores by correlating low scores with module design elements – in order to identify common themes. (A rough sketch of this kind of correlation analysis follows this list.)
Quality assurance (Actor: QA department)
To have clear oversight of programmes and their outcomes – in order to focus resources and identify problems.
Resource usage (Actor: Head of school)
To clearly see how resources are used / allocated – in order to make adjustments to improve efficiency.
Cymraeg (Actor: Academic manager)
To identify students at risk of changing their language of study – in order to reduce the likelihood of them switching language.
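To give a flavour of how the first of these use cases might be approached, here is a minimal sketch in Python using pandas. The data file and column names are entirely hypothetical – the point is simply that module-level design features could be correlated with satisfaction scores as a first pass, before any deeper (and more qualitative) investigation.

```python
# Illustrative sketch only: correlating module design features with NSS-style
# satisfaction scores. The file and column names are hypothetical placeholders.
import pandas as pd

# One row per module: design features plus an overall satisfaction score.
modules = pd.read_csv("module_design_and_nss.csv")  # hypothetical extract

design_features = ["assessment_count", "group_work_hours",
                   "online_activity_count", "feedback_turnaround_days"]

# Simple first pass: correlate each design feature with the satisfaction score
# to surface candidate themes for closer investigation.
correlations = (modules[design_features]
                .corrwith(modules["nss_overall_satisfaction"])
                .sort_values())

print(correlations)
```

Correlation on its own proves nothing about cause, of course; the value here would be in flagging design elements worth discussing with course teams.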
We categorised the use cases on learning and teaching into areas covering course design, programme design, course review, assessment planning, learning and teaching, feedback, personalisation and technology design. I’ve selected examples from some of these areas:
Category: course design
Minimum standard for online provision (Actor: Faculty or department head)
To ensure that all modules have some VLE engagement opportunities – in order to give students a consistent experience and online learning that is valuable.
Category: programme design
Data-informed programme (Actor: Member of the planning committee)
Access to evidence on what works in learning and assessment, suitable to inform programme approval – in order to provide effective feedback and improve new programmes from the outset.
Category: course review
Reporting service (Actor: Head of school)
A reporting service (dashboard) giving access to metrics on course and module performance, outcomes, satisfaction rates, engagement, student profiles (HESA background), teaching style, cost, etc.
Category: assessment planning
Identify assessment methods that improve attainment (Actor: Programme director)
Identify assessment methods and assignments that genuinely improve student attainment and embed the learning – in order to inform programme and module design with evidence-based best practice. (A simple illustration of this kind of comparison appears after these examples.)
Category: learning and teaching
Improve modules (Actor: Senior lecturer)
To be able to compare student performance and engagement on particular academic tasks – in order to improve elements of modules and the performance of individual students within them.
Category: feedback
Feedback (Actor: Tutor)
To know what impact my feedback has – in order to improve future feedback.
Category: technology design
Apps (Actor: Deputy Director of IT)
Check the use of purchased resources – in order to [use the time of my developers appropriately].
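As a rough illustration of the assessment planning use case above, the sketch below compares attainment across assessment methods. The dataset and field names are hypothetical placeholders; in practice any differences would need to be interpreted alongside cohort, subject and other contextual factors rather than read as evidence of what "works".

```python
# Illustrative sketch only: comparing attainment across assessment methods.
# The dataset and its columns are hypothetical placeholders.
import pandas as pd

# One row per student per assessed module component.
results = pd.read_csv("assessment_results.csv")  # hypothetical extract

# Average mark and spread by assessment method, with counts so that small
# groups can be treated with appropriate caution.
summary = (results
           .groupby("assessment_method")["mark"]
           .agg(["count", "mean", "std"])
           .sort_values("mean", ascending=False))

print(summary)
```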
Those are just a few of the 63 use cases assembled. In the afternoon we broke into groups again and looked at the data sources that would be required for six of the use cases. We came up with 45 suggestions for data sources, ranging from attendance records to audience response tools to student surveys.
Here are a couple of examples of how the use cases were expanded.
How are resources used, and are they effective?
Challenges: how do we measure effectiveness; qualitative versus quantitative analysis; the blended balance between offline and online; how do we measure whether learning outcomes have been met?
Risks: we go for the low-hanging fruit and only measure what leaves a footprint.
Use case: session-based data collection points linked to session-based learning outcomes.
Data requirements: interaction data during passive lectures; people (who is effective); good, measurable learning outcomes; audience response systems linked to learning outcomes.
Understanding and evaluating learning and teaching
Aims: use cases of what works and what doesn’t; sharing good practice; recommendations for intervention based on data.
Challenges/risks: change management; may encourage a superficial approach to learning and teaching; agreeing a taxonomy and tagging activity types.
Who? Learning developer, lecturer, head of department, students.
Data: teaching sessions; content; assessments; learning and teaching approach and methodology; engagement (VLE, content capture); broader curriculum analytics information to better understand impact versus other factors. (A minimal sketch of joining some of these sources follows below.)
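As a minimal sketch of how some of these data sources might be brought together – assuming hypothetical extracts for VLE engagement, module outcomes and tagged learning and teaching approaches – the example below joins them at module level so engagement and attainment can be viewed alongside the tagged approach.

```python
# Illustrative sketch only: joining VLE engagement with module outcomes and a
# tagged learning-and-teaching approach. All file and column names are
# hypothetical placeholders.
import pandas as pd

engagement = pd.read_csv("vle_engagement.csv")     # student_id, module_id, vle_minutes
outcomes = pd.read_csv("module_outcomes.csv")      # student_id, module_id, final_mark
approaches = pd.read_csv("module_approaches.csv")  # module_id, lt_approach (tagged activity type)

combined = (engagement
            .merge(outcomes, on=["student_id", "module_id"])
            .merge(approaches, on="module_id"))

# Median engagement and attainment by tagged approach: a starting point for
# discussion, not a verdict on what works.
print(combined.groupby("lt_approach")[["vle_minutes", "final_mark"]].median())
```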
We now have a lot of ideas and material to take curriculum analytics further. Other institutions and researchers have already made progress in developing methodologies and systems to analyse aspects of the curriculum, but I’ve not come across a similarly comprehensive attempt to define the area.
Our plan is to take some of these ideas to the next Jisc Learning Analytics Research Group meeting at Keele University on 24th January, obtain the input of a wider group, and see how we can make them more concrete as a community.