What type of innovation is the Learning Experience Platform?

Greg Satell, writing in the Harvard Business Review, identified four distinct types of innovation. So where does an innovation like the Learning Experience Platform (LXP) sit within his matrix?

In Learning Pool’s recent white paper, Powering the Modern Learning Experience, our Chief Product Officer, Ben Betts, and John Helmer FLPI touch on the question of what type of innovation an LXP represents. In the course of their analysis, they cite an article in the Harvard Business Review by Greg Satell, The 4 Types of Innovation and the Problems They Solve. However, whilst arguing that an LXP represents an evolution rather than a revolution in learning systems, they didn’t place it definitively in any of the four boxes on Satell’s matrix (reproduced below).

Do LXPs fit neatly into any of these categories and does it really matter which? Well, the way we conceive of innovation often influences the use we make of it and we want LXPs to be useful, valuable and rewarding for both the learner and their place of work. So, whilst tightening the definition around what type of innovation an LXP represents isn’t going to change the world, it might helpfully inform our choices of how it could and should be used.

Satell divides innovations into four different categories along two dimensions: how well defined the problem the innovation addresses is, and how well defined the domain of skills needed to solve it is.

This can sound a little difficult to understand, but the example he gives for breakthrough innovation helps to make it clearer. A group of microchip designers was tasked with designing a sensor to detect pollutants at very small concentrations in water. The marine biologist who had been assigned to the team was able to tell them that clams are able to detect pollutants at a few parts per million, at which point their shells open. This radically simplified the problem: all the team had to do now was build a sensor that could detect when clams opened their shells – much cheaper and simpler to do than trying to detect pollutants directly.

Satell gives this as an example of breakthrough innovation within his definition because while the problem was well defined, the domain knowledge needed to solve it did not come from an obvious quarter. It was the presence of a marine biologist on the team, with his “off-topic” knowledge of undersea creatures and their behavior, that led to the solution – not any item of domain knowledge possessed by the chip designers.

In the Learning Pool white paper mentioned above, Ben and John didn’t come to any firm decision on innovation categorization but suggested that the LXP is most likely a Sustaining Innovation – i.e. one where both the problem that needs to be solved and the domain knowledge that is needed to solve it are well defined. However, deeper reflection on Satell’s matrix might lead us to a different conclusion.

What is the problem that LXPs solve?

It is certainly true that the problem LXPs were created to solve was well defined: a problem that has its roots in the dissatisfaction experienced by many learning professionals with the limitations of the classic LMS model. The white paper discusses in some detail the drivers of that dissatisfaction, which can be broadly summarised as:

  • Changes in the pattern of working – e.g. gig economy, contingent workforces
  • Changes in the needs and expectations of the workforce – as summarised in Bersin’s definition of the Modern Learner
  • Changes in the technology landscape and the user behaviors they have given rise to – e.g. multifunctional device use, ubiquitous internet connection, social media, personalization
  • The collapse of instructional design orthodoxies – with the emergence of new ideas about learning theory such as 70:20:10, informal and social learning, etc.

This combination of forces led to many modifications to systems originally designed according to the classic paradigm of the LMS, as new features were added year on year. Eventually, however, the sheer accumulation of such fixes led to ‘feature bloat’, bringing the underlying paradigm itself under extreme pressure. A paradigm shift was necessary in order that progress in learning systems could keep pace with the demands of the market.

The problem, then, was well defined. But from where would the domain knowledge come that would solve that problem?

The content conundrum

Answering this question satisfactorily involves looking a little more deeply at the problem itself.

Learning is a rich and complex thing, so it follows that the problems generated within organizational learning are similarly rich and complex. It would be a mistake to caricature the problem an LXP solves as merely one of content delivery. So much else is bound up in the process of learning beyond the content itself. 

However, content delivery does form a substantial portion of the activities carried out by the average L&D team – and is, moreover, something at which the classic LMS was often seen to be spectacularly failing. So it is useful for our purposes here to ask why content delivery had become such a big part of the problem for the old paradigm.

Looking at the list of forces above that put pressure on the LMS, we see a variety of drivers that had impacted the world of learning content, leading to a situation of much greater content diversity:

  • Learning design had moved beyond the course as the default unit of instruction, embracing a huge variety of types of content, both more and less immersive: video, gamified learning, checklists, infographics, etc.
  • Widespread access to search engines and always-on internet and social media platforms via personal devices had put the acquisition of learning content on a completely different footing; the consumer experience was now driving workplace expectations
  • Integration of cloud-based platforms through APIs had accustomed learners to a free-flowing browsing experience across multiple sites that undermined the idea of one monolithic learning platform that could meet all needs

The result was a content landscape more diverse in access points, content types, content authorship and ownership (including user-generated content), length and interactivity.

Meanwhile the LMS had, baked into its structure, a standard for content distribution and tracking – SCORM (Shareable Content Object Reference Model) – designed around a much narrower, and as it turned out hopelessly outdated, set of assumptions about what learning content might be and how it might behave. No wonder the classic delivery model was creaking!
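To make those narrow assumptions concrete, here is an illustrative (not exhaustive) sketch of the kind of learner data a SCORM 1.2 package reports back to an LMS – essentially a single status, score and time per course-like object:

```python
# Illustrative sketch: a few cmi.core.* fields from the SCORM 1.2
# run-time data model. The standard models learning content as a
# course-like object with one completion status and one score -
# assumptions that fit poorly with video, podcasts, checklists
# or user-generated content.
scorm_report = {
    "cmi.core.lesson_status": "completed",  # passed | failed | completed | incomplete | browsed | not attempted
    "cmi.core.score.raw": "85",             # SCORM 1.2 transmits values as strings
    "cmi.core.session_time": "0000:12:30",  # HHHH:MM:SS.SS timespan format
}

# Everything richer - which video segments were watched, what was shared
# or commented on, learning that happened outside the LMS - falls
# outside this vocabulary.
print(scorm_report["cmi.core.lesson_status"])
```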

So now we come to address the crunch question in deciding what sort of innovation LXP represents according to the Satell model: from where did the domain knowledge come that solved this problem?

If the LXP had resulted solely from straight-line development of the LMS model along paths laid out by corporate roadmaps and using domain knowledge from within the field of learning technologies, then we could safely put it in the box in the Satell matrix labelled Sustaining Innovation. But that is not what happened.

There is a phrase from the marketing literature used to promote LXPs – ‘the Netflix of learning’ – which, although widely contested and overused to the point of irritation, nevertheless gives a clue to at least one of the sources of external domain knowledge that were drawn on instead.

It is probably no coincidence that LXPs began to spring up at around the same time that L&D professionals, worried by the problem of low learner engagement, were beginning to reach out to related domains such as consumer marketing theory to see how techniques of engagement could be harnessed in the service of learning. If learning was to be more learner-directed, and less about command-and-control, then some means of creating desire and appetite for learning content had to be found. The domains of consumer media and marketing were obvious places to look for these means.

Early examples of LXPs channelled social media to bring YouTube-style video sharing to their new platforms, as well as Trip Advisor-style rating systems and other social features like sharing, liking and commenting. Personalization, in the form of AI-driven recommendations, was a feature of media platforms like Netflix and Spotify, as well as online traders like Amazon. Artificial Intelligence itself was a knowledge domain almost completely alien to learning technologies before the advent of LXPs (barring some notable exceptions in adaptive systems from providers such as Knewton and CogBooks).
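The recommendation idea borrowed from consumer platforms can be sketched in miniature. The catalogue, feature tags and weights below are invented purely for illustration (real recommenders are far more sophisticated), but the principle is the same: surface content similar to what the learner has already engaged with.

```python
# Minimal illustrative sketch of item-based recommendation:
# score each catalogue item by cosine similarity to a vector
# summarising the learner's viewing history, then rank.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical catalogue: each item tagged with invented feature
# weights (e.g. topic relevance, video-ness, duration bucket).
catalogue = {
    "Intro to data video":  [1.0, 0.9, 0.2],
    "Compliance checklist": [0.1, 0.0, 0.9],
    "Data ethics podcast":  [0.9, 0.7, 0.3],
}

def recommend(history_vector, catalogue, top_n=1):
    """Return the top_n catalogue items most similar to the learner's history."""
    ranked = sorted(catalogue.items(),
                    key=lambda kv: cosine(history_vector, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

# A learner who mostly watches data-related video content:
print(recommend([1.0, 0.8, 0.2], catalogue))  # → ['Intro to data video']
```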

With a well-defined problem, and solutions drawing on external knowledge domains that were not well defined at the time the problem was first encountered, it therefore seems logical to give the LXP the status of a Breakthrough Innovation under the Satell model.

It seems fitting to describe this development as a breakthrough, where learning has reached beyond its traditional bounds to external knowledge domains. Learning can too easily feel like something kept apart from the everyday working lives of those who stand to benefit most from it: the learners. In making these breakthrough connections to fields like marketing, media and AI, LXPs surely stand a chance of bringing learning closer to the people who most benefit, growing a learning culture and, ultimately, weaving learning more closely into the organizational fabric.

I believe the LXP was and remains a true Breakthrough Innovation.

But what do you think?

Read the white paper now
