
Notes from UK Learning Analytics Network event in Nottingham

Our third network event took place last week in Nottingham.  This is the site of one of the biggest institutional roll-outs of learning analytics in the UK – at Nottingham Trent University – and we were keen to hear from Student Engagement Manager, Ed Foster, about it.

[Image: Nottingham city centre]

But first we spent the morning listening to a series of pitches for funding of our micro-projects, which aim to encourage innovation in learning analytics. Descriptions of these projects are in an earlier post. There were some great and varied ideas from a wide range of presenters and institutions. The presenters were grilled by a panel and by the audience, who then voted for their preferred projects over lunch. After deliberations by the panel the winning projects were selected. My colleague, Paul Bailey, sums up the results:

“All the micro-project pitches were interesting and worthy of being supported, but I tasked the panel with choosing three. In the end the panel recommended that the following four ideas should be supported, and I now have approval to fund them all:

1. The call of duty visualised – workload planning tool, Open University
2. Automated system for cognitive presence coding, University of Edinburgh
3. Can survey data be an indicator of learning outcomes? University of Greenwich
4. Revolutionising teaching and learning, City of Liverpool College

We look forward to hearing what they have achieved in January, when I hope we will revisit the University of East London for what will be the 5th Learning Analytics Network meeting.”

In the afternoon we had an engaging presentation from Ed Foster about Nottingham Trent’s learning analytics initiative: What have we learnt from implementing learning analytics at NTU?

I’d been fortunate to visit the University during my investigation into the state of play of learning analytics in the UK last year, so I was looking forward to hearing the latest. NTU had three main aims for the project: improving retention, improving the sense of belonging and engagement, and improving attainment. A dashboard integrated biographical data, such as the student’s enrolment status, with evidence of their engagement from door swipes, library loans, VLE use and assessment submissions.

The dashboard shows engagement across the cohort and rates each student’s engagement as High, Good, Satisfactory or Low. There are separate views for students and staff. An individual’s week-by-week engagement rating can be viewed alongside the cohort’s, and staff can add comments. Alerts can also be raised for at-risk students.
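To make the idea concrete, here is a minimal sketch (in Python) of how signals such as door swipes, library loans, VLE logins and assessment submissions might be combined into a banded weekly rating. The weights, thresholds and function names are illustrative assumptions of mine, not NTU’s actual algorithm.

```python
# Hypothetical sketch: combine engagement signals into a single weekly band.
# Weights and thresholds are invented for illustration only.

def engagement_rating(door_swipes, library_loans, vle_logins, submissions,
                      weights=(0.3, 0.1, 0.4, 0.2),
                      thresholds=(7.5, 5.0, 2.5)):
    """Return a banded rating (High/Good/Satisfactory/Low) for one student-week."""
    signals = (door_swipes, library_loans, vle_logins, submissions)
    score = sum(w * s for w, s in zip(weights, signals))
    if score >= thresholds[0]:
        return "High"
    if score >= thresholds[1]:
        return "Good"
    if score >= thresholds[2]:
        return "Satisfactory"
    return "Low"

# Example: a student with modest activity across all four sources
print(engagement_rating(door_swipes=5, library_loans=1, vle_logins=8, submissions=1))
```

In practice the banding would be calibrated against the cohort for each course, but the principle – several engagement sources reduced to one easily read rating – is the same.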

The dashboard has now been rolled out across the institution and the majority of tutors are using it.  But does it make any difference?

Well, first of all, Ed had found evidence of the oft-reported significant correlation between engagement and success. If you’re a first year with “high” engagement, you’re three times more likely to progress to your second year than if your engagement is “low”. If you’re highly engaged in your final year, you’re twice as likely to get a good degree as your peers with low engagement.

In general engagement is a far more important predictor of success at NTU than demographic data or entry qualifications. Disadvantaged groups still perform worse but engagement significantly increases their chance of academic success.

Tutors are now changing how they work with students because of the dashboards. They report having better information on individuals as well as on their tutor groups, allowing them to target their interventions better. Meanwhile more than a quarter of students said the dashboard was changing their behaviour: they were attending more, taking out more books, accessing the VLE more and generally trying to improve their engagement ratings.

So has all this resulted in improvements to retention, belonging and attainment? In the pilot year, when a tutor put a note in the dashboard, a student’s engagement normally rose over the next seven days.  But it fell for students with high engagement ratings. Meanwhile progression increased on two courses and fell on the other two.  But there were other possible reasons for the fall in progression on those courses.  In other words: there are some promising changes in behaviour of both students and tutors but not yet conclusive evidence of significant impact.

Ed pointed out just how important communications are in general, and how key the role of the tutor is in effecting real change. Without time, training and motivation, tutors are unlikely to be able to use the dashboard to help students significantly. As with all projects aimed at rolling out technology across an institution, accompanying changes to culture and practice are required.

Next up was our colleague Michael Webb, who outlined the architecture Jisc is putting in place for learning analytics and described how institutions can get involved. Michael’s slides show the different options and timescales – and anyone who wishes to take part should formally register their interest by 5th July (which doesn’t commit you or your institution to anything at this stage). You should also sign up to the analytics group on jiscmail.ac.uk to ensure you receive further announcements.

 

From sunny Nottingham we travelled to sunny California, where Dr John Whitmer from Blackboard had just woken up as we neared the end of an intensive day. Connecting to us via Blackboard Collaborate, John gave a presentation entitled “Using Learning Analytics to Assess Innovation & Improve Student Achievement”.

John’s research has been driven by three “meta-questions”:

1. How can we provide students with immediate, real-time feedback? (especially to identify students at risk of failing a course)
2. How can we design effective interventions for these students?
3. How can we assess innovations (or status quo deployments) of academic technologies?

He also wants to know whether these findings apply equally to students ‘at promise’ (as opposed to ‘at risk’) because of their background – race, class, family education or geography, for example.

John talked about work he did at Chico State University, where he found a significant positive correlation between hits on the VLE and grades obtained. Like Ed, he had found engagement (in this case VLE accesses) to be a better predictor of success than demographic variables. Looking at how total VLE hits correlate with grade, he discovered that “at risk” students (based on demography etc.) experienced an “over-working gap”, where students obtaining B grades were using the VLE more than those obtaining A grades. This was not the case with “at promise” students.
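As a rough illustration of this kind of analysis, the sketch below correlates total VLE hits with final grade, split by an “at risk” flag derived from background data. The toy rows and the grade-point scale are invented purely for illustration; they are not John’s figures.

```python
# Illustrative sketch: correlate total VLE hits with grade, per student group.
# All data below is made up for demonstration purposes.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sdx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sdy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sdx * sdy)

# (total VLE hits, grade points 4=A..0=F, at_risk flag) -- toy rows
students = [
    (820, 3, True), (640, 4, False), (910, 3, True), (300, 1, True),
    (720, 4, False), (150, 0, True), (560, 3, False), (480, 2, True),
]

for flag in (True, False):
    hits = [h for h, g, r in students if r == flag]
    grades = [g for h, g, r in students if r == flag]
    print(f"at risk={flag}: r = {pearson(hits, grades):.2f}")
```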

Next John discussed work he’d done with San Diego State University, where the aim was to identify effective interventions driven by learning analytics “triggers” such as low attendance and low graded items. “Flagged” students were then sent a notification or intervention. The data was subsequently aggregated, combined with demographic data and analysed. John wanted to know whether the triggers were accurate predictors of course grade, whether the interventions improved grades, and whether this was influenced by the students’ background characteristics.
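A minimal sketch of how such a trigger-and-flag step might look is below. The trigger names echo those mentioned (low attendance, low graded items), but the thresholds and data structure are assumptions for illustration only, not SDSU’s actual system.

```python
# Rough sketch of rule-based "triggers" that flag students for an intervention.
# Thresholds and the record structure are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    name: str
    attendance_rate: float        # proportion of class sessions attended
    graded_items_completed: int
    triggers: list = field(default_factory=list)

def apply_triggers(student, min_attendance=0.6, min_graded_items=3):
    """Attach trigger labels; any trigger means the student is flagged."""
    if student.attendance_rate < min_attendance:
        student.triggers.append("low attendance")
    if student.graded_items_completed < min_graded_items:
        student.triggers.append("low graded items")
    return bool(student.triggers)

cohort = [
    StudentRecord("Student A", attendance_rate=0.85, graded_items_completed=5),
    StudentRecord("Student B", attendance_rate=0.40, graded_items_completed=2),
]

for student in cohort:
    if apply_triggers(student):
        print(f"Flag {student.name} for intervention: {', '.join(student.triggers)}")
```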

As other studies have suggested, the tone of any messages sent as interventions is important.  Here the typical intervention would use a “concerned friend” tone:

 … data that I’ve gathered over the years via clickers indicates that students who attend every face-to-face class meeting reduce their chances of getting a D or an F in the class from almost 30% down to approximately 8%.

So, please take my friendly advice and attend class and participate in our classroom activities via your clicker.  You’ll be happy you did!

Let me know if you have any questions.
Good luck,
Dr. Laumakis

John found that the number of triggers could predict achievement with 50% accuracy for students at Grade E, but there was no significant effect for those receiving an A grade. While interventions appeared to make a difference, when the experiment was repeated the following year no significant difference in grade was found between the experimental and control groups. However, this disappointing result may be explained by the statistics for the opening of the intervention messages and the clickthrough rate from those messages to further guidance: both were significantly lower the second time. The conclusion is that if the interventions had actually been taken up by the students they might have benefitted from them.

John also concluded that diverse sources of data from student technology use provide better predictions of student achievement. Meanwhile demographic data, while less effective at predicting success than usage data, gives nuanced understanding and helps to identify trends. We’re at an early stage in learning analytics, he says, but be prepared for quantum leaps in the near future.

Our next network get-together will be at Bradford University and we’re looking at dates in October. Watch this space or sign up to the analytics jiscmail list for further details.

By Niall Sclater

Niall Sclater is Consultant and Director at Sclater Digital Ltd and is currently carrying out work for Jisc in Learning Analytics.

One reply on “Notes from UK Learning Analytics Network event in Nottingham”

I like your approach but multiple approaches can tend to overwhelm. You will need to differentiate more strongly between inputs (such as Teaching) and outputs (such as Learning). This is what I hope I achieved in my thesis on Student Learning Failure (Uni of Canberra) and papers on Retention. I wish you all the best.
