5 Ways the Learning Analytics Landscape Is Evolving

After combining data from our annual learning measurement surveys with L&D practitioner stories shared during the Watershed Insights Summit, we’ve noticed a growing shift in how both L&D professionals and their organizations view learning analytics.

That is, L&D doesn’t just want to measure learning’s impact; the broader organization is actually prioritizing it. In this post, we’ll discuss what these trends mean and how L&D departments can continue to develop their measurement practices.

Measuring learning's impact on the business

During the past three years, Watershed and LEO Learning have conducted an annual measurement survey to consolidate the individual voices of thousands of market participants—including instructional designers, chief learning officers, and learning technology providers from all over the world.

We now have enough data to start spotting meaningful trends in how the opinions of L&D departments are changing over time. It can be difficult to change institutionalized beliefs about whether an activity—such as learning analytics—is worth the time, but changing this mindset is becoming more attainable, as you'll see below.

Read the in-depth third annual measurement survey report to dig deeper into the charts, statistics, and general takeaways.

1) Desire + Belief = Action

During the last three years, we’ve gone from 3 out of 4 respondents expressing a desire to measure the business impact of learning programs (those who agree or strongly agree) to 19 out of 20 today.

Further, respondents feel more strongly than ever about the possibility of demonstrating learning’s impact.

People want to demonstrate the impact of learning, and they believe they can. This change is due in part to the innovative projects that learning organizations such as Visa, Verizon, PwC, and Applied have undertaken and shared with the world via conferences, xAPI cohorts, and case studies.

These projects show a wide audience of learning professionals what’s possible in a modern, data-connected learning ecosystem that’s guided by a learning analytics strategy.

But this change also is driven by the increasing number of vendors in our space offering high-quality, native xAPI implementations—or at least accessible data exports that can be ingested as xAPI data.
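
For readers newer to the spec, an xAPI statement is just a JSON record describing who did what: an actor, a verb, and an object. Here’s a minimal sketch in Python; the learner, email, and course IDs are illustrative placeholders (the “completed” verb URI is the standard one from ADL’s registry).

```python
import json

# A minimal xAPI statement: actor (who), verb (did what), object (to what).
# The name, mbox, and course ID below are illustrative, not real data.
statement = {
    "actor": {
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding-101",
        "definition": {"name": {"en-US": "Onboarding 101"}},
    },
}

# Serialized, this is the kind of record a learning record store (LRS) ingests.
print(json.dumps(statement, indent=2))
```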

In any case, what’s really important about these survey results is what they predict.

If you have a desire to do something and you believe it’s possible, there’s not much standing in the way of taking action.

These results reflect the current reality, in which interest within the market is converting to action, whether that’s experiments, pilots, or all-out enterprise deployments.

Of course, in an organizational context where you have many players and stakeholders, it may not be enough for you alone to have the desire and belief. In this case, demand from the organization is the other key ingredient.

2) Executive expectations continue to rise.

The share of respondents reporting that their executives expect to see measurement of learning’s business impact has jumped from around a third in the initial survey to more than two-thirds today. That swing from a minority to a clear majority is the fastest-moving trend across the entire survey.

Learning professionals are no longer the only audience paying attention to the changing tides of learning analytics. The executives—who are responsible for directing major initiatives and changes to keep their organizations alive and relevant—are taking notice as well.

As so many of us know, the attention of senior executives is often driven by need. And the growing demand suggested by these survey results indicates an increasing organizational awareness of the need for effective development of people in the workplace.

But how will the success of learning programs actually be measured?

3) There are different ways to measure learning’s success.

Though the results are more subtle here, there’s a shift away from non-evaluation and ROI-based evaluation toward evaluation based on “organizational impact.”

Historically, organizations have attempted to measure the financial return of learning programs directly, an approach that can obscure the important cause-and-effect relationships driving that return.

But we’re seeing a healthy shift toward measuring learning more directly through its impact on operational metrics and internal KPIs, which should ultimately translate to the bottom line.

This shift in attitude agrees with what our own clients and other interested parties are telling us. They ultimately want to connect data from learning activities with the operational performance metrics of their learners.

For example, connecting these dots is a crucial part of understanding not only whether the recently launched video platform in your enterprise is getting engagement, but also if it’s actually making a difference in customer satisfaction scores or sales results.

Being able to connect these dots and quantify the impact of new initiatives or changes in approach is a powerful way to drive further investment in learning.
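
As a rough illustration of what connecting those dots might look like, here’s a minimal sketch in Python using pandas. The file names and columns (learner_id, videos_watched, csat_score) are hypothetical, and this isn’t Watershed’s methodology; it simply joins per-learner engagement with a KPI and compares groups.

```python
import pandas as pd

# Hypothetical extracts: per-learner engagement from the video platform
# (e.g., aggregated from xAPI statements) and CSAT scores from a CRM.
engagement = pd.read_csv("video_engagement.csv")   # learner_id, videos_watched
kpis = pd.read_csv("customer_satisfaction.csv")    # learner_id, csat_score

merged = engagement.merge(kpis, on="learner_id")

# Compare average CSAT for learners above vs. below median engagement.
merged["highly_engaged"] = (
    merged["videos_watched"] > merged["videos_watched"].median()
)
print(merged.groupby("highly_engaged")["csat_score"].mean())

# A simple correlation as a starting point; correlation alone doesn't
# prove the learning program caused the difference.
print(merged["videos_watched"].corr(merged["csat_score"]))
```

Even a simple comparison like this can show whether engaged learners look different on the KPI; establishing causation takes more careful design, but it’s a credible first step toward quantifying impact.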

So what’s getting in the way of measuring the success of learning?

4) Actionable challenges create possibilities.

There’s a fascinating shift away from institutional challenges to operational challenges.

In the initial survey, respondents cited main obstacles such as:

  • It’s too hard,
  • It’s too costly,
  • No demand, and
  • Competing priorities.

Now, they’re reporting their main obstacles are:

  • Don’t know where to start,
  • No access to data, and
  • Other (which we assume covers more specific, miscellaneous operational challenges).

As the possibilities of learning analytics become clearer and demand increases, the perception of difficulty, cost to value, and lack of priority are all decreasing.

And they’re slowly being replaced with the sort of challenges people will identify when they are ready to take action, such as how to get started or how to start collecting data. These are much more tractable challenges!

5) Marry data with storytelling to initiate change.

Changing institutionalized beliefs about whether an activity is worth the time is difficult, but pairing data with a narrative that aligns with business goals can help.

To change how learning analytics is viewed and used in your organization, start first within your team, and build out from there.

Need some L&D inspiration?

Stay tuned for our next post, as David Rosenfeld shares how he used data storytelling to gradually advance the measurement practices of his L&D colleagues.

Want to build a business case for learning analytics?

Use this free selection tool to find the best way to get started building a learning analytics program in your organization.
