Learning Analytics: Data Evaluation Complexity

Data evaluation is by far the most common complexity level of learning analytics we observed in our research study. Within this complexity, we found 16 types of analysis represented, which we’ve grouped under three headings and will explore in this blog post.

Analysis Types for Data Evaluation

In our last blog post, we outlined the four complexities of learning analytics—including data evaluation, which asks whether what's happening is good or bad.

At this complexity level, we’re applying high-school-level math—means, medians, modes, and other basic statistics—to aggregate the data and establish benchmarks. Most analytics fall into this basic data evaluation category, but they still offer tremendous value and opportunities for some big wins.
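As a minimal sketch, that kind of aggregation might look like the following (the quiz scores here are made up for illustration):

```python
from statistics import mean, median, mode

# Hypothetical quiz scores pulled from an LMS export (illustrative only)
scores = [72, 85, 85, 90, 64, 78, 85, 92, 70, 88]

benchmark = {
    "mean": mean(scores),      # average score across learners
    "median": median(scores),  # middle value, less sensitive to outliers
    "mode": mode(scores),      # most common score
}
print(benchmark)  # {'mean': 80.9, 'median': 85.0, 'mode': 85}
```

Numbers like these become the benchmarks against which future cohorts or resources are judged.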

Here are the analysis types we found in this complexity:

1) Frequency and Completion

Frequency and completion analysis types look at how often things happen and whether things have happened. This includes:

  • Attendance Analysis: How many people showed up? Are sessions and training resources well utilized?
  • Certification Analysis: How many people are certified? Have people obtained their required certifications?
  • Completion Analysis: How many people completed a course, video, quiz, etc.? Have people completed what’s required?
  • Utilization Analysis: How often is a learning resource used? What are the most-used resources?
  • Engagement Analysis: What level of engagement does a resource have (e.g. comments, likes, shares)? What’s engaged with the most?
  • Frequency Analysis: How often does something happen? This is a catchall for anything that doesn’t fall into one of the analysis types covered above.

For example, the following report shows frequency analysis of mistakes during a mock code blue simulation.

This report shows that three steps of the Mock Code Blue see significantly more mistakes than the others. This is a good indication that more training may be needed in these areas.
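A frequency count like this can be produced straight from a mistake log. The sketch below is hypothetical; the step names are illustrative and not taken from the actual report:

```python
from collections import Counter

# Hypothetical mistake log from mock code blue simulations;
# step names are illustrative, not taken from the report above
mistake_log = [
    "start compressions", "attach defibrillator", "call for help",
    "start compressions", "attach defibrillator", "start compressions",
    "check responsiveness", "attach defibrillator", "call for help",
]

# Count how often a mistake was recorded against each step
mistakes_per_step = Counter(mistake_log)
for step, count in mistakes_per_step.most_common():
    print(f"{step}: {count}")
```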

2) Outcome

Outcome analysis looks at the outcome of things, such as how well people perform or how a learning experience is rated. This includes:

  • Before-and-After Analysis: What changed before and after an event (e.g. assessment scores, job performance, etc.)?
  • Learner Ratings Analysis: How did learners rate a learning experience?
  • Confidence Analysis: How confident are people in their answers?
  • Time Spent Analysis: How much time is spent learning?
  • Progress Analysis: How far are people through an event or experience?
  • Question Analysis: How well are people answering questions?
  • Cost Analysis: What's the cost to provide the event or experience?
  • KPI Analysis: Where do KPI values stand?

The following confidence analysis example shows a comparison of confidence against correctness.

This kind of report could be used to identify people above the line who are more correct than they are confident, and people below the line who are more confident than they are correct. Confident but wrong people can be dangerous because they will act on their incorrect knowledge.
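As a rough sketch, flagging those two groups might look like the following; the learner data and the 0.1 calibration margin are made up for illustration:

```python
# Hypothetical per-learner summaries: share of questions answered correctly
# and average self-reported confidence, both on a 0-1 scale
learners = [
    {"name": "A", "correct": 0.90, "confidence": 0.60},
    {"name": "B", "correct": 0.55, "confidence": 0.85},
    {"name": "C", "correct": 0.75, "confidence": 0.78},
]

THRESHOLD = 0.1  # arbitrary calibration margin for this illustration

for person in learners:
    gap = person["correct"] - person["confidence"]
    if gap > THRESHOLD:
        label = "more correct than confident (above the line)"
    elif gap < -THRESHOLD:
        label = "more confident than correct (below the line)"
    else:
        label = "roughly calibrated"
    print(f"{person['name']}: {label}")
```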

3) Others

A couple of analysis types don’t fit into the first two collections.

For example, the following report shows use of search by device over time:

The majority of searches happen on desktop, and there was a significant increase in both desktop and mobile search in the spring of 2018. This may be a useful prompt to dig deeper into why activity increased so dramatically, or to proactively evaluate whether existing content is accessible and useful on mobile devices.
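A minimal sketch of how search events might be grouped by device and month follows; the events below are hypothetical:

```python
from collections import defaultdict

# Hypothetical search events as (month, device) pairs from an analytics export
search_events = [
    ("2018-03", "desktop"), ("2018-03", "mobile"),
    ("2018-04", "desktop"), ("2018-04", "desktop"), ("2018-04", "mobile"),
    ("2018-05", "desktop"), ("2018-05", "desktop"),
]

# Count searches per (month, device) combination
searches = defaultdict(int)
for month, device in search_events:
    searches[(month, device)] += 1

for (month, device), count in sorted(searches.items()):
    print(f"{month} {device}: {count}")
```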

What Does Data Evaluation Look Like in Practice?

While there’s a wide variety of analysis types, our research study revealed that the majority of report views within the data evaluation complexity relate to completion analysis or utilization analysis. (This is a broadly similar picture to categories, where most views are for program and experience reports.)

A lot of the completion analysis reports relate to program analytics, with some relating to experience analytics. Many of the utilization analysis reports relate to experience analytics, with some relating to learner analytics.

As we discussed in our last post, organizations are generally looking to get the basics down first before moving on to more complex analytics around impact and effectiveness.

And these basics mean reporting that people have completed the things they are supposed to have completed, and reporting on the usage of learning resources and platforms.
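For instance, reporting completion of a required course can be as simple as the sketch below, using hypothetical learner records:

```python
# Hypothetical completion records: learner -> whether the required course is done
required_course_status = {
    "learner_01": True,
    "learner_02": False,
    "learner_03": True,
    "learner_04": True,
}

completed = sum(required_course_status.values())
completion_rate = completed / len(required_course_status)
print(f"{completed}/{len(required_course_status)} complete ({completion_rate:.0%})")
# -> 3/4 complete (75%)
```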

Across the breadth of different analysis types, we do see some organizations moving beyond these basics, but that’s not yet the norm.

Actionable Insights

Even within data evaluation, there’s a lot you can do beyond completion and utilization, but few organizations are doing so. Think about how you can get more data about engagement and other areas of analysis for richer insights. Use the worksheet below as a starting point.

Up Next: The ‘Advanced Evaluation’ Complexity

In our next learning analytics blog post, we'll explore some examples of the next level of complexity: Advanced Evaluation and its five analysis types.
