Learning Analytics: Advanced Evaluation Complexity

Advanced evaluation is less common than data evaluation, which means there's no well-worn path leading the way. Organizations venturing into this level of complexity often forge their own approaches rather than following a standard playbook.

That's why we've divided the advanced evaluation complexity into five analysis types, which we'll explore in this post along with examples of what organizations are doing at this level of complexity.

What Is Advanced Evaluation?

As a quick reminder, advanced evaluation applies statistical techniques, such as correlation and regression analysis, to understand what happened and why it happened. This type of evaluation also generates theories about causation, which allows you to focus on what's working best while scrapping ineffective learning.

In other words, advanced evaluation asks: Why is this happening?
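To make that concrete, here's a minimal sketch in Python (using pandas and SciPy, with invented data; column names such as assessment_score and sales_kpi are purely illustrative) of the kind of statistical technique involved: a simple linear regression testing whether a learning measure helps explain a business measure.

```python
# Minimal sketch: does a learning measure help explain a business KPI?
# The data and column names (assessment_score, sales_kpi) are hypothetical.
import pandas as pd
from scipy import stats

learners = pd.DataFrame({
    "assessment_score": [55, 62, 70, 71, 78, 84, 90, 93],
    "sales_kpi":        [3.1, 3.4, 3.2, 4.0, 4.3, 4.1, 4.8, 5.0],
})

# Ordinary least-squares fit of the KPI on the assessment score
result = stats.linregress(learners["assessment_score"], learners["sales_kpi"])

print(f"slope={result.slope:.3f}, r={result.rvalue:.2f}, p={result.pvalue:.4f}")
# A small p-value suggests the relationship is unlikely to be chance,
# though correlation alone doesn't prove the training caused the change.
```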

Advanced Evaluation & Analysis Types

Now, it's time to learn more about the five analysis types we identified that fall under the advanced evaluation complexity:

  1. Chain of Evidence
  2. Drop-Off
  3. Segment
  4. Workflow
  5. Qualitative Responses

Our Learning Analytics Research Study data shows that, within advanced evaluation, there isn't one analysis type with significantly more report views than the rest.

1) Chain of Evidence

The term chain of evidence refers to showing training's impact on business performance by tracking evidence across a chain of events, ranging from the learning experience and knowledge gained to improved performance and business impact.

This process is loosely based on Kirkpatrick’s four levels of learning evaluation.

One type of advanced evaluation is to pick two links in this chain (as illustrated above) and evaluate the extent to which they are related. This analysis seeks to validate both the logic of the chain envisaged by the learning design and the effectiveness of the learning strategy in practice.

For instance, this example correlation report from Watershed shows the relationship between an assessment score (a measure of learning) and a customer satisfaction rating (a business KPI).
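If your report data can be pulled into a pandas DataFrame, a rough sketch of correlating two links in the chain might look like this (the scores, ratings, and column names below are invented for illustration):

```python
# Minimal sketch of correlating two links in the chain of evidence:
# an assessment score (learning) and a customer satisfaction rating (business KPI).
# The data and column names here are hypothetical.
import pandas as pd

chain = pd.DataFrame({
    "assessment_score": [60, 68, 72, 75, 81, 88, 95],
    "csat_rating":      [3.2, 3.5, 3.4, 3.9, 4.1, 4.4, 4.7],
})

r = chain["assessment_score"].corr(chain["csat_rating"])  # Pearson by default
print(f"Correlation between assessment score and CSAT: {r:.2f}")
```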

2) Drop-Off Analysis

Drop-off analysis looks at where people are exiting a particular process. For example, this might mean looking at the following:

  • how far through people watch a video
  • the slides where people drop out of an e-learning course
  • how far people get through a MOOC before they disengage

The following example shows a drop-off analysis for an xAPI-tracked game we hosted during a conference. Perhaps confusingly, there's negative drop-off from launching the game to starting an attempt, which happened because:

  • not everyone used the launcher, and
  • a single player could start multiple games from the same launch.

We then see a massive drop-off between people starting and finishing the game.
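As a rough illustration, here's how a drop-off funnel like this could be computed from simplified xAPI-style statements (the verbs and counts below are made up, not our actual conference data):

```python
# Minimal sketch of a drop-off (funnel) analysis over simplified
# xAPI-style statements with one verb per interaction.
import pandas as pd

statements = pd.DataFrame({
    "actor": ["a", "a", "b", "b", "b", "c", "d", "d"],
    "verb":  ["launched", "started", "launched", "started", "completed",
              "started", "launched", "started"],
})

funnel_order = ["launched", "started", "completed"]
counts = statements["verb"].value_counts().reindex(funnel_order, fill_value=0)

# Drop-off between consecutive stages. A negative value means the later
# stage was hit more often than the earlier one, as in the game example.
drop_off = 1 - counts / counts.shift(1)
print(pd.DataFrame({"count": counts, "drop_off_vs_previous": drop_off.round(2)}))
```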

This insight helped reinforce that, while the game might be great in a work context, people just didn't have the time or interest to finish several rounds of gameplay at a large-scale conference. (And they didn't need to play every round, as a few rounds were enough to show off the game.)

3) Segment Analysis

Segment analysis involves identifying a specific group of people and then selecting that group for further analysis. For instance, you might want to:

  • know what learning activities are most popular amongst your top salespeople, or
  • compare average scores for people who generally use mobile versus those who favor desktop.

The following example shows a scatter plot, which is one way to identify a segment. The highlighted yellow area identifies managers with high point-of-sale gross profit (POS GP) percentages and low chargeback (i.e., rebates) percentages.

This group can also be filtered in other reports for further analysis. For instance, you can compare how people with high POS GP percentages and low chargeback percentages performed on an assessment against those with low POS GP percentages and high chargeback percentages, which shows how well the assessment predicts KPI values.
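Here's a minimal sketch of how such a segment could be selected and compared, assuming hypothetical columns like pos_gp_pct and chargeback_pct and made-up thresholds:

```python
# Minimal sketch of selecting a segment for further analysis.
# Names, column names, and thresholds are hypothetical.
import pandas as pd

managers = pd.DataFrame({
    "manager":        ["Ann", "Ben", "Cho", "Dee", "Eli"],
    "pos_gp_pct":     [42.0, 55.0, 58.0, 39.0, 61.0],
    "chargeback_pct": [4.1, 1.2, 0.8, 5.0, 1.0],
    "assessment":     [64, 81, 88, 59, 90],
})

# High POS GP, low chargebacks (the highlighted area of the scatter plot)
segment = managers[(managers["pos_gp_pct"] >= 50) & (managers["chargeback_pct"] <= 2)]

print("Segment mean assessment score:", segment["assessment"].mean())
print("Everyone else:", managers.drop(segment.index)["assessment"].mean())
```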

4) Workflow Analysis

Workflow analysis looks at how learners find or access learning resources. For instance, did they come from a search, a recommendation, or a homepage link? This kind of analysis can help determine the best way to promote new or featured content within the general information architecture of your platform.

The following example looks at the number of times people launched different items from a particular panel on a platform. The 4-Hour Workweek is clearly the most-clicked recommended item.

So, what's special about it? Perhaps it sits in the top-left position, has a catchy image, or everybody just wants to know about four-hour workweeks. With some further digging, you can use this information to improve the clickability of future recommendations.
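For illustration, counting launches by access route might look something like this in pandas (the item titles and sources below are invented):

```python
# Minimal sketch of a workflow analysis: counting launches by the route
# people took to the content. Items and sources are made up.
import pandas as pd

launches = pd.DataFrame({
    "item":   ["The 4-Hour Workweek", "The 4-Hour Workweek", "Deep Work",
               "Deep Work", "The 4-Hour Workweek", "Atomic Habits"],
    "source": ["recommendation panel", "search", "homepage link",
               "search", "recommendation panel", "recommendation panel"],
})

# How often each item was launched, broken down by access route
workflow = launches.groupby(["item", "source"]).size().unstack(fill_value=0)
print(workflow)
```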

5) Qualitative Responses

Qualitative survey responses can help you understand the reasons behind your quantitative data. Data makes much more sense when you know the context, and qualitative information provides that context. Don’t overlook it!

For example, if everybody fails question 37 and the feedback says there's a bug with question 37 (e.g., all the answer options are the same), you know why everyone failed that question. You can also use qualitative responses to explore significant successes and failures in more depth, following Brinkerhoff's Success Case Method.
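As a simple illustration, pairing per-question failure rates with any matching feedback comments might look like this (the question numbers and comments are invented):

```python
# Minimal sketch of pairing quantitative results with qualitative feedback,
# so an unusual number (everyone failing question 37) gets its context.
import pandas as pd

results = pd.DataFrame({
    "question": [36, 36, 37, 37, 37],
    "correct":  [True, False, False, False, False],
})
feedback = pd.DataFrame({
    "question": [37],
    "comment":  ["All the answer options for this question look identical."],
})

fail_rates = 1 - results.groupby("question")["correct"].mean()
summary = fail_rates.rename("fail_rate").reset_index().merge(
    feedback, on="question", how="left")
print(summary)
```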

Actionable Insights

Currently, there's no clear path as you move into the advanced evaluation phase of learning analytics; you have to make your own. Either pick the most relevant analysis type we've listed in this post and implement it in your organization, or explore other ways to understand why things are happening.

Up Next: The Predictive and Prescriptive Complexity

Next week, we reach the dizzy heights of the “Predictive and Prescriptive” complexity. So hold on to your hats; it gets windy up there!



