
Trends Unpacked (Part 3): Technical Challenges and Learning Analytics

This is a guest post from Lindsay Pineda, Senior Implementation Consultant, and Amanda Mason, Senior Business Analyst, at Unicon, Inc.

How are you approaching a learning analytics implementation at your institution? Is the impression that it is mostly a technical implementation, or mostly an organizational/cultural one? As we discuss in this installment of the “Trends Unpacked” series, it is actually both, and it is important that your institution recognizes this as well.

Amanda Mason, a Senior Business Analyst with Unicon, has extensive experience in recognizing technological challenges related to systems integration, investigating technical requirements, and strategic analysis. All of these skills have been vital in connecting the technical side of a learning analytics implementation to the organizational one. You can read more about Amanda’s experience in her bio at the end of this article.

In the past few “Trends Unpacked” articles, the focus has been mainly on the organizational challenges, observations, and recommendations. In this third installment, Amanda and I are going to focus on the first three aspects of technical challenges and trends:

  • Demonstration of sufficient learning analytics knowledge
  • Institutional infrastructure
  • Data management

In future posts, we will cover the remaining technical challenges and trends.

Demonstration of Sufficient Learning Analytics Knowledge

As highlighted in the article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns,” some of the challenges and trends we observed were related to demonstrating sufficient learning analytics knowledge:

  • Many individuals within technical departments expressed concern about their collective learning analytics knowledge. Most felt they had some knowledge; however, much of it had been focused on institutional analytics.
  • The bulk of the time spent on learning analytics education took place within the technical departments themselves.

The following examples illustrate the types of learning analytics knowledge challenges most often expressed at the institutions:

  • What do we do with the data? – This was a common theme expressed by many at the institutions we visited; at one particular institution we were told, “How is this different than what we already do? We already have loads of data, we just don’t do anything with it.” This is a consistent statement among institutions. Most do have loads of data they have been collecting for decades, but that is not the difficulty. The difficulty lies in where the data resides: in several different systems, in different departments, and often in the minds of tenured individuals who are sought out to advise in specific situations. Some viewed this wealth of data as a positive thing because so much information was being captured about students. However, the fact remains that a lot of data does not equal a lot of knowledge about how to use it. As one institution pointed out, “There is the perception that having loads of data is a good thing, but we’re not sure if it’s at all useful or valuable.”
  • We need a systematic way to collect data – At other institutions, technical staff voiced frustration regarding the sheer volume of data collected. They explained that it is difficult to determine how to collect the right data, in the right format, for the right practices from all of the varying systems they currently have in place. This same group advised us, “We need to be very clear about the data we collect currently and how this is different than what we are going to collect in the future. There needs to be a structure in place to systematically collect data moving forward.”

Institutions shared some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Define a clear purpose for data collection – Institutions recognized the need to use data to help inform the effective delivery of a positive student experience. One institution told us, “We need to do something with the data. We can’t just let it sit there.” Questions need to be asked of the relevant individuals and departments to determine where the priorities for data collection lie. Having a clear idea of what data should be collected, and for what purpose, is imperative; it ensures that processes and policies will reflect the institution’s goals for learning analytics. As another institution communicated, “We have a good sense, at an institutional level, of what data is needed, but we need to determine where the priorities lie at the learner level and what power there is in that data.”
  • Define the data points for collection – Most institutions have policies and procedures according to government requirements, but few have actual definitions of data points according to their specific usage on campus. For example, at one institution, “student engagement” meant how many times a student logged into their Learning Management System (LMS). At another institution, this meant how many times a student physically showed up to the classroom. It is paramount that data points such as student engagement, retention, completion rates, and employability are well defined at the start of the initiative. Asking questions, brainstorming with others from different disciplines, and taking the time to define the specific data points that will be collected can benefit the collaborative development of clearly defined policies and processes moving forward.
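
As a rough illustration of what defining the data points can look like in practice, the sketch below records institution-agreed definitions in a small data dictionary. The field names, definitions, and source systems here are hypothetical examples rather than a prescribed schema; the point is simply that each term is written down once, with its source and collection method, so that every department operationalizes it the same way.

```python
from dataclasses import dataclass

@dataclass
class DataPointDefinition:
    """An agreed, institution-wide definition of a single data point."""
    name: str              # e.g. "student_engagement"
    definition: str        # what the term means at this institution
    source_system: str     # where the raw data comes from (VLE/LMS, SIS, registers, ...)
    collection_method: str # how the data is actually gathered

# Hypothetical definitions -- each institution would agree on its own.
DATA_DICTIONARY = {
    "student_engagement": DataPointDefinition(
        name="student_engagement",
        definition="Number of VLE/LMS logins per student per week",
        source_system="VLE/LMS",
        collection_method="Automated export of access logs",
    ),
    "attendance": DataPointDefinition(
        name="attendance",
        definition="Proportion of timetabled sessions physically attended",
        source_system="Attendance registers",
        collection_method="Tutor-entered register, synced nightly",
    ),
}

if __name__ == "__main__":
    # Print the shared definitions so any department can check them.
    for key, dp in DATA_DICTIONARY.items():
        print(f"{key}: {dp.definition} (source: {dp.source_system})")
```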

Institutional Infrastructure

Many of the institutions we visited already had a data warehouse, but it housed limited integrated data from only one or two systems, such as the Virtual Learning Environment (VLE)/LMS or the Student Information System (SIS). All of the institutions still had many manual processes (e.g. collecting attendance on Excel spreadsheets) whose outputs were not being captured or housed in a collective place. Additionally, the institutions expressed an interest in, and a desire for, collective information housed in one easily accessible place (i.e. a “single source of truth”).

The following examples illustrate the types of institutional infrastructure challenges most often expressed at the institutions:

  • Data gaps and limited data availability – We were informed, at several institutions, that the readiness assessment process was the first time some staff had investigated the data the institution currently collects. In preparation for the onsite visits, many began to identify gaps in data or take notice of the limited amount of data available for use with learning analytics technology. One institution told us, “Not having certain data hasn’t been a problem in the past because it hasn’t been required for use.” This seemed to be a shared theme among the institutions visited. Another institution gave a specific example of missing or limited data related to library information: “We’ve noticed there are quite a few bits of data the library doesn’t capture. And the information it does collect often can’t be shared due to privacy policies.” These types of situations were common and certainly a point of frustration for many institutions.
  • Data ownership concerns – We found that several departments within an institution collect data, but they do not own the information itself. The institution, as a collective body, often officially owns the data, but does so without a clear gatekeeper for access to that data. This can pose the problem of who should own the data collected and how one gains access to the information for learning analytics purposes. One institution articulated to us, “There is no clear data ownership within the institution. We are unclear on who decides what data is used and collected and what isn’t. There is nothing systematic about the way we approach data.”

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • “Single source of truth” – This can be both a technical and organizational solution. Institutions expressed that having one place to go for most, if not all, data collected would be the most effective way to mitigate the questions surrounding data collection processes and data ownership. One institution advised, “We only have ‘single points in time’ for data right now, like the VLE activity. What we need is a more dynamic view with more systems included and housed in one place to access it.” The solution could be a centralized data warehouse or Learning Records Warehouse (LRW) that is linked to several systems for data collection. For institutions, having one place to go to get the information needed saves time, energy, and effort. An individual at one institution said, “The University has never had a holistic viewpoint on what data is needed and collected over time. This would be hugely beneficial for everyone if we found a way to do this.”
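
To make the “single source of truth” idea concrete, here is a minimal, hypothetical sketch of consolidating records from several systems into one queryable store. The table layout, system names, and sample records are invented for illustration; a real centralized warehouse or LRW would involve proper identity matching, scheduled feeds, and a standard record format, but the underlying idea is the same: normalize records from each system into one shape so that a single query gives a cross-system view of a learner.

```python
import sqlite3

# One central store (in-memory here purely for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE learning_records (
        student_id   TEXT,
        source       TEXT,   -- which system the record came from
        activity     TEXT,
        occurred_at  TEXT
    )
""")

# Hypothetical extracts from separate systems, normalised to one shape.
vle_activity = [("s001", "VLE", "logged_in", "2017-03-01T09:00:00")]
sis_records  = [("s001", "SIS", "module_enrolment", "2017-02-01T00:00:00")]
attendance   = [("s001", "Registers", "attended_lecture", "2017-03-01T10:00:00")]

for batch in (vle_activity, sis_records, attendance):
    conn.executemany("INSERT INTO learning_records VALUES (?, ?, ?, ?)", batch)

# A single query now gives a cross-system view of one learner.
for row in conn.execute(
    "SELECT source, activity, occurred_at FROM learning_records "
    "WHERE student_id = ? ORDER BY occurred_at", ("s001",)
):
    print(row)
```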

Data Management

Individuals at the institutions collectively expressed frustration with inconsistent data management practices and policies. We found that all institutions were using data, but for many different purposes, and these processes were often out of sync with one another. Most individuals within the institutions were not aware of which data other departments were using or for what purpose.

The institutions we visited were generally compliant with the UK’s Data Protection laws and policies; however, each department appeared to have its own interpretation of those laws and policies. They also expressed a desire for a unified way of managing data, implemented across the entire institution.

The following examples illustrate the types of data management challenges most often expressed at the institutions with whom we spoke:

  • “Lack of confidence” when it comes to data collection policies – Several institutions communicated a lack of confidence in the data currently being collected, including whether or not it was of good enough quality for a learning analytics initiative. There was also concern regarding the lack of unified policies to guide the collection of data itself. We heard several discussions around issues such as the absence of a universal policy for VLE/LMS usage, unclear attendance policies, and data not being collected uniformly across departments. Individuals within one large group at an institution were unaware of the roles their colleagues held within the university, how their job duties affected one another, and how the data each role needed overlapped. One individual at that institution stated, “We have a vast array of ways we collect data at the moment and so many issues around that. We are seeing missing data, needs for data which is not currently gathered, attendance data that varies across programs and we have no idea if the data that actually is available is any good.”
  • Data protection and privacy policy confusion – An individual at one institution told us, “We have so many different policies and procedures for every department and program across the university right now. I have no idea how we are going to control that moving forward.” We found the same to be true across other institutions we visited. Most had policies and procedures regarding data protection and privacy, but they felt things were inconsistently enforced and poorly executed. Another individual voiced, “We need to be really clear about what data the institution is using and why we are using it. How in depth does it need to be? And what is the risk of doing it versus not doing it? We just have no idea right now.”

Institutions shared some ideas with us regarding potential solutions and recommendations that they feel would be beneficial:

  • “It is important to determine how data is captured and that it is standardized” – This is a direct quote from a member of an Information Technology (IT) group at one institution, who told us that the group regularly discusses its frustrations about this topic. The bottom line for them was that “controls about confidentiality levels, data that is captured and how it is used need to be determined upfront. This needs to be done the same way for everyone as well; not just IT.” Standardizing, in the initial stages of a learning analytics initiative, what data will be collected, who will gather it, and for what purpose it will be used helps outline the process for future iterations. It also allows for a smooth transfer of responsibility should any staff or resources leave the institution.
  • Establish a searchable “policy bank” – The individuals within one institution’s technical group shared that “having a searchable ‘policy bank’ would greatly cut down on the confusion about where a policy is and how to follow it.” This particular institution found it difficult to train new staff on the policies and procedures to be followed, because of the vast number of policies that existed and the fact that no one executed those policies in the same way. Another institution we visited had spent a significant amount of time and effort developing a very clear policy regarding student consent, including what “consent” meant to that institution, who can give consent to whom, and when consent is required. After all of this effort, there was still no single place where newly onboarded staff could find the policy. As it was pointed out to us, “This is not an efficient, or even practical way, to house policies. If we can’t find it and it’s not readily available to everyone, how can we make sure we are adhering to it?” Establishing a centralized, searchable policy bank or storage hub significantly benefits institutions.
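
As a toy illustration of a searchable “policy bank,” the sketch below stores policy summaries in one central structure and searches them by keyword. The policy names and texts are invented for this example; in practice an institution would more likely use an intranet or document-management search, but the benefit is the same: one known place where any staff member can find the current policy.

```python
# A minimal, illustrative "policy bank": policies stored centrally and
# searchable by keyword. Titles and texts are hypothetical.
POLICY_BANK = {
    "student-consent": (
        "Student Consent Policy: defines what consent means at the "
        "institution, who may give consent to whom, and when it is required."
    ),
    "vle-usage": (
        "VLE/LMS Usage Policy: minimum expectations for staff use of the "
        "virtual learning environment."
    ),
    "attendance-recording": (
        "Attendance Recording Policy: how attendance is captured and by whom."
    ),
}

def search_policies(query: str) -> list[str]:
    """Return the ids of policies whose text mentions the query term."""
    q = query.lower()
    return [pid for pid, text in POLICY_BANK.items() if q in text.lower()]

if __name__ == "__main__":
    print(search_policies("consent"))  # -> ['student-consent']
```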

Throughout our visits, we found a common, prevalent misconception that an initiative such as learning analytics is either a technical solution or an organizational one. It is both, and the two need to be addressed concurrently. Technical challenges are very real and prevalent; however, focusing only on those particular challenges can lead to the fallacy that a learning analytics solution is a “silver bullet” or “magic cure.” Our message to institutions, based on our experience, is that there is no “fix-all” solution that will resolve the entirety of the systemic challenges an institution faces. Learning analytics technology is a tool designed to help provide a richer student experience. It is not meant to be the solution; it is meant to be a part of the solution.

Please be on the lookout for another article coming later this month regarding the quantifiable findings from the readiness assessments conducted and highlighted in the article, “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns.”

Here’s to continuous growth and improvement!

