The L&D maturity curve

Over the course of my career, I’ve witnessed a slow but steady shift away from formal learning to informal learning.

Of course, remnants of the “formal first” philosophy still exist, whereby a training solution, typically in the form of a course, is thrown at every conceivable problem. Over time, the traditional classroom-based delivery of such courses has increasingly given way to online modules, but that’s merely a change in format – not strategy.

While courses certainly have their place in the L&D portfolio, the forgetting curve places a question mark over their long-term effectiveness on their own.

The informal first philosophy swings the pendulum back by empowering the employee to self-direct their learning in accordance with their personal needs.

While in some cases informal learning obviates the need for training, in other cases it complements it. For example, I see the informalisation of learning as an opportunity to deliver the content (via a wiki, for example) that can be consumed at the discretion of the employee. The focus of the course then pivots to the application of the content, which is the point of learning it in the first place. Similarly, the assessment evaluates the learning in the context of real-world scenarios, which is what the learner will encounter post-course.

And since the content remains accessible, it can be used for ongoing reference long after the course has been completed.


While I consider the informal first philosophy a giant leap in L&D maturity, it essentially pertains to instructional design. For a more holistic view of L&D, I propose an “assessment first” philosophy by which the capability of the target audience is analysed prior to any design work being undertaken.

The rationale for this philosophy is best appreciated in the context of an existing employee base (rather than greenhorn new starters). Such a group comprises adults who have a wide range of knowledge, skills and experiences. Not to mention they’ve probably been doing the job for a number of years.

Sheep-dipping everyone in this group with the same training doesn’t make much sense. For a minority it might be a worthwhile learning experience, but for the majority it is likely to be redundant. That renders the training a waste of time, and an unnecessary burden on the L&D team.

By first assessing the target audience’s proficiency in the competencies that matter, a knowledge gap analysis can identify the competencies in which the population is weak, and targeted training can be delivered in response. Individuals who are “not yet competent” in particular areas can be assigned personalised interventions.

This approach avoids the solution first trap. By focusing the L&D team’s attention on the real needs of the business, not only does the volume of demand shrink, but the work becomes more relevant.

The assessment first philosophy may appear incongruent where new starters are concerned, since they are by definition assumed to be weak in all competencies – after all, they’ve only just walked through the door! – but I counter that assumption on two fronts.

Firstly, not all new starters are doe-eyed college grads. Many have had previous jobs in the industry or in other industries, and so they arrive armed with transferable knowledge, skills and experiences.

And regardless, the informal first philosophy holds true. That is to say, the new starter can consume the content (or not) as they see fit, demonstrate their understanding in the scenario-oriented “course”, and formalise it via the assessment.

The results of the assessment dictate any further intervention that is necessary.

Of course, some topics such as the company’s own products or processes will necessitate significant front-end loading via content development and maybe even curricula, but these may be considered the exception rather than the rule. By looking through the lens of assessment first, the L&D team works backwards to focus that kind of energy on where it is warranted.

It is also worth noting the assessment first philosophy renders the traditional “pass mark” obsolete, but such a radical idea is a story for another day!


While the assessment first philosophy represents an exponential leap in the maturity of L&D, there is yet another leap to make: “performance first”.

The raison d’être of the L&D team is to improve performance, so it has always been a mystery to me why our work is so often disconnected from the business results. I do appreciate the barriers in our way – such as the inexplicable difficulty of obtaining the stats – but still, we can and should be doing more.

Under the performance first paradigm, it is not knowledge gaps that are analysed, but rather performance gaps. A root cause analysis identifies whether the cause is a capability deficiency or not – in the case of the former, a capability analysis feeds into the assessment first approach; in the case of the latter, a solution other than training is pursued instead.

As with assessment first, performance first may appear incongruent where new starters are concerned. After all, their stats thus far are zero, and waiting to recognise poor performance may have unacceptable consequences.

So again we have an exception to the rule, whereby some folks may be scaffolded through L&D intervention prior to their performance being analysed. However, the point is we needn’t force everyone down that road. It depends on the circumstances.

And again, by looking through the lens of performance first, the L&D team works backwards to focus its energy on where it is needed. But this time with results at the forefront of the team’s purpose, its relevance to the business goes through the roof.

[Figure: The L&D Maturity Curve, rising from Formal First to Informal First to Assessment First to Performance First. The x-axis represents the maturity of the L&D function and the y-axis represents its relevance to the business.]

I realise my take on L&D maturity might freak some of my peers out. Concurrently, others will argue that we should leapfrog to performance first now and get on with it.

Personally I consider the maturity curve a journey. Yes, it is theoretically possible to skip stages, but I feel that would be a shock to the system. From a change management perspective, I believe an organisation at one stage of the curve would achieve more success by growing into the next stage of the curve, while ironing out the bugs and creating the new normal along the way.

Besides, it isn’t a race. Important journeys take time. What matters is the direction in which that journey is heading.

11 thoughts on “The L&D maturity curve”

  1. I think you’re right Ryan in putting performance and assessment at the front, but I’m not sure that it’s a development/maturity progression in the order you’ve outlined. From an instructional design pov, even if you just run with the ADDIE model, it all starts with analysis. I agree that we haven’t done this all that well, and that good analysis has performance at its heart, formative assessment as its tool, and the evaluation criteria of the model as its guide. But it’s still been there, and for a long time.
    The shift from formal to informal doesn’t reflect a change in the model (both still start with good analysis, which will determine which learning is most appropriate (if any, using the knowledge, skills, motivation, environment framework)), but more an acknowledgement of the value of alternatives to formal learning (something that’s been both underestimated and long-overdue in L&D).
    I think that the maturity is not in progressing through to an informal first, assessment first, performance first position, but in holding all those ideas and working out what best suits the learners, the context, the content. There is no one way (or first).

  2. Certainly Neil, I think the ADDIE model holds true (if that’s what you want to use). I suppose the point in relation to the “Analysis” phase is: what are you analysing?

  3. Therein lies the rub! Even our language is unhelpful here with many referring to this stage as Training Needs Analysis, presuming the answer is training. For me, good analysis is broad – I want to analyse the business and any identified needs. I want to analyse what they think the performance outcomes would be that would address those needs. I want to analyse what’s already in place in terms of learning and resources (right across 70:20:10). I want to analyse ethnographic detail. I want to analyse any existing metrics and data that relate to all of these!

  4. Interesting article Ryan. I have been trying to foster a learning culture within my organisation and often struggle to encourage staff to engage with learning beyond formal training. Inertia, lack of curiosity, and an inability/unwillingness to collaborate and see this as learning has been a challenge in trying to engender the informal first philosophy.

    I have been pondering for some time whether it is possible to leapfrog from a formal first to a performance first philosophy, and what resources would be needed to do so. Yes, I think it would be a shock to the system as you point out, but I am wondering whether the evolution would be too drawn out otherwise.

    I appreciate your conclusion that it is an important journey and that it takes time. My organisation’s journey has encountered wrong turns, engine stalls and meandering side-trips. But (to punish this metaphor even more!) we now have a driver and destination, so if I can get the map right, we will hopefully get back on course.

  5. Keep going, Elizabeth. You may be right, leapfrogging may be the way to go after all. And those hiccups you mention are universal!

