
Learning Analytics Adoption and Implementation Trends: Quantitative Analysis of Organizational and Technical Trends

This is a guest post from Lindsay Pineda, Senior Implementation Consultant, Unicon.

  • What tool is used to quantify organizational and technical areas of readiness?
  • What can institutions learn from quantitative analysis of organizational and technical aspects of readiness?

For many institutions, the idea of beginning a learning analytics initiative may seem overwhelming and complex. There is often a perception that too much work needs to be done before any initiative can be explored. You might be surprised to learn that most institutions are already prepared for some type of learning analytics initiative. Whether your institution is ready to visualize student data on a dashboard or to pursue a small-scale pilot, quantitative and qualitative analysis of organizational and technical trends supports an overall sense of institutional readiness for learning analytics.

The article “Learning Analytics Adoption and Implementation Trends: Identifying Organizational and Technical Patterns” outlined some of the key trends surrounding organizational and technical patterns obtained through Readiness Assessments conducted as part of Jisc’s learning analytics project [1]. Similar to the qualitative trends outlined previously, organizational and technical quantitative trends are evident across institutions. By sharing these trends in the aggregate, institutions interested in learning analytics may find value in the quantitative analysis of organizational and technical readiness.

The Readiness Assessment process is designed to be collaborative and is conducted onsite with a variety of key stakeholders from across the institution. After the onsite visit, a comprehensive report is delivered back to the institution containing all observations, direct feedback from participants (collected anonymously, as quotes without individual names attached), both qualitative and quantitative measures, and recommended next steps. The main goal of a Readiness Assessment is to gather key stakeholders and facilitate discussion, using activities designed to help participants generate productive and collaborative conversations.

As introduced in the previous article, a matrix is used to measure readiness. The Unicon Readiness Assessment Matrix rates institutional readiness based on six criteria: Data Management/Security, Culture, Investment/Resources, Policies, Technical Infrastructure, and IR (institutional research) Involvement. These criteria provide an outline for the qualitative and quantitative elements discussed within the comprehensive report and throughout the onsite visits.

Quantitative Analysis

To gather information, we used a series of pre-visit electronic questionnaires, pre-visit phone calls, onsite visits, collaborative conversations with varying stakeholders, and post-visit communications. After collecting information from each institution, numeric values were assigned on a three-point Likert scale against a set of specific criteria. Then, once we had a robust sample of institutions, we created an aggregate for the sample.
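To illustrate the aggregation step, here is a minimal Python sketch; the institution names, criteria, and scores are hypothetical, and this is not Unicon’s actual tooling:

```python
# Hypothetical example of the aggregation step: per-criterion Likert ratings
# (1 = not ready, 2 = somewhat ready, 3 = ready) from several institutions
# are averaged into a single aggregate for the sample. Names and scores are
# illustrative, not actual assessment data.
ratings = {
    "Institution A": {"Culture": 2, "Policies": 1, "Technical Infrastructure": 3},
    "Institution B": {"Culture": 3, "Policies": 2, "Technical Infrastructure": 2},
    "Institution C": {"Culture": 1, "Policies": 2, "Technical Infrastructure": 2},
}

# Collect the full set of criteria, then average each one across the sample.
criteria = sorted({c for scores in ratings.values() for c in scores})
aggregate = {
    c: sum(scores[c] for scores in ratings.values()) / len(ratings)
    for c in criteria
}

for criterion, avg in aggregate.items():
    print(f"{criterion}: {avg:.1f}")
```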

Rating Scales

For each criterion, a rating of “one” indicated an institution was “not ready” due to the type and difficulty of the obstacles present. For example, when assessing organizational readiness, an institution may lack “high level buy-in for learning analytics,” as evidenced by insufficient senior leadership support for the initiative or by leadership needing significant convincing of the initiative’s value.

When an institution had this type of rating, several recommendations were made to help it address the concerns in each of the areas outlined. An example of a recommendation for an institution with a rating of “one” in the area of “leadership support” might be to set aside time with individual leaders to communicate the value of the initiative. Demonstrating potential retention benefits, understanding each leader’s goals for the institution, and involving the leader in communications and consultations moving forward may assist with overall understanding.

A rating of “two” indicated an institution was “somewhat ready” to move forward with a learning analytics initiative. When this type of rating occurred, it was often due to a combination of more difficult and less difficult obstacles. For example, when assessing both organizational and technical readiness, an institution may not have “resource support available,” as evidenced by a small number of staff possessing the needed skill sets; however, it may have a plan to address this by hiring outside resources for the required roles.

As with a rating of “one,” several recommendations were made to help the institution move forward. An example of a recommendation for an institution with a rating of “two” in the area of “resourcing” might be to work with department leadership to establish the skills needed for each role. Involving stakeholders by giving them a forum to express their needs, while communicating the benefits of added resources, could increase commitment to future efforts.

A rating of “three” indicated an institution was “ready” to proceed with a learning analytics initiative. When this type of rating occurred, it was often due to the institution demonstrating strengths such as having the needed resources to implement a learning analytics initiative, full senior leadership support, and a culture supporting the importance of data-driven decision making.

Recommendations for these institutions were made to help guide them toward full-scale implementation. An example of a recommendation for an institution with a rating of “three” in the area of “a culture supporting the importance of data-driven decision making” might be to implement a small pilot. Choosing a department or school within the institution to run a smaller pilot can provide the evidence necessary to proceed with a full-scale implementation.
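Taken together, the three tiers act as a simple mapping from a rating to a readiness label and a recommendation theme. Here is a minimal sketch of that mapping; the labels come from the scale above, while the recommendation wording is illustrative:

```python
# Map a three-point Likert rating to its readiness label and an example
# recommendation theme, following the three tiers described above.
# The recommendation wording is illustrative, not prescribed by the article.
READINESS = {
    1: ("not ready", "meet individual leaders to build buy-in"),
    2: ("somewhat ready", "define required skills and resourcing with department leads"),
    3: ("ready", "run a small departmental pilot before full-scale rollout"),
}

label, theme = READINESS[2]
print(f"Rating 2 -> {label}; suggested focus: {theme}")
```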

Quantitative Trends Summary

The specific criteria and scales used to quantitatively assess readiness are presented below. All scores are based on hypothetical institutional data.

Organizational readiness for implementation (cumulative average):
1 = not ready, 2 = somewhat ready, 3 = ready

Assessment Criteria | Score
Demonstrates sufficient learning analytics knowledge | 2.0
Level of change management comfort/willingness | 1.0
Level of student engagement demonstrated/communicated | 2.0
High level buy-in for learning analytics demonstrated/communicated | 1.5
Organizational support demonstrated/communicated | 1.5
Organizational infrastructure currently available | 1.5
Policy/Practice management – Current processes | 1.0
Obstacles/Challenges demonstrated/communicated | 2.0
Ease of integration with existing organizational structure | 1.5
Policy changes required | 1.0
Level of understanding of ethics and privacy concerns | 2.0
Level of enthusiasm demonstrated/communicated | 2.0
Resource support available | 1.5

Technical readiness for implementation (cumulative average):
1 = not ready, 2 = somewhat ready, 3 = ready

Assessment Criteria | Score
Demonstrates sufficient learning analytics knowledge | 2.0
Level of change management comfort/willingness | 1.0
High level buy-in for learning analytics demonstrated/communicated | 1.5
Institutional level of analytics – Current processes | 2.0
Institutional infrastructure currently available | 1.5
Data management knowledge/processes demonstrated/communicated | 1.5
Data security knowledge demonstrated/communicated | 1.5
Ease of integration with existing infrastructure | 2.0
Maintenance support/feasibility (long term) | 1.5
Resource support available | 1.5
Obstacles/Challenges demonstrated/communicated | 2.0
Level of enthusiasm demonstrated/communicated | 2.0

The data from the hypothetical institution presented above demonstrates an overall rating of “somewhat ready” to implement a learning analytics initiative on its campus. This indicates that, while the institution may face obstacles in areas such as change management, those obstacles can be overcome with planning and collaborative effort. For example, this institution may be ready to pilot a small-scale technology solution, provided it addresses some of the other areas outlined, including policy/practice management and data security.
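For reference, the “somewhat ready” verdict can be reproduced from the two tables above. A minimal sketch, assuming the cumulative average is the mean of the criterion scores rounded to the nearest tier (the rounding rule is an assumption, not stated in the article):

```python
# Cumulative averages for the hypothetical institution, using the scores
# from the two tables above.
organizational = [2.0, 1.0, 2.0, 1.5, 1.5, 1.5, 1.0, 2.0, 1.5, 1.0, 2.0, 2.0, 1.5]
technical = [2.0, 1.0, 1.5, 2.0, 1.5, 1.5, 1.5, 2.0, 1.5, 1.5, 2.0, 2.0]

LABELS = {1: "not ready", 2: "somewhat ready", 3: "ready"}

for name, scores in [("Organizational", organizational), ("Technical", technical)]:
    avg = sum(scores) / len(scores)
    # Rounding to the nearest tier is an assumed rule.
    print(f"{name}: {avg:.2f} -> {LABELS[round(avg)]}")

# Output:
# Organizational: 1.58 -> somewhat ready
# Technical: 1.67 -> somewhat ready
```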

Conclusion

Combined with qualitative analysis, quantifying readiness can be very effective in helping institutions assess how prepared they are for a learning analytics initiative. Quantitative analysis also helps institutions move toward a culture of data-driven decision making. The Unicon Readiness Assessment Matrix not only provides a quantitative and qualitative look at specific organizational and technical readiness criteria, but also presents an overall institutional readiness level for learning analytics.

Look for the next article in the related “Trends Unpacked” series to come in August 2017, where we will continue to discuss how the quantitative and qualitative data guides institutions in overcoming organizational-specific challenges.

