Dead-Simple ‘Ninja’ Trick For Justifying Your ELearning Designs

By: Justin Ferriman September 24, 2015

I have come to a realization during my elearning consulting engagements: not everyone enjoys elearning but everyone has an opinion on it.

If you are an instructional designer, then you have probably seen this yourself. In most cases it involves a subject matter expert (SME) refusing to set aside time with you to discuss course content.

When they finally do make the time, they spend it telling you why the course design is wrong and how it could be better.

I’ve had this issue several times.

I can recall one SME who refused to meet with me (he kept pushing out the scheduled meeting). When we finally met, he opened up an elearning course that he had worked on over 10 years ago(!) in an effort to show me how the course I was working on should be done.

This is an extreme example and not something you're likely to encounter too often. However, if you haven't already, you will find that people will try to tell you how to do your job, and that's not very fun.

It’s important that you don’t let them steamroll you, but that you push back in a way that is respectful.

Just because someone drives a car doesn’t make them a race car driver. In the same respect, just because someone has taken an online course or two doesn’t make them an elearning developer.

Remember, this is your area of expertise!

Justifying Made Easy

So how do you justify the decisions you make in your course design? Well, there is one foolproof way to both establish yourself as an expert and put an end to the constant suggestions. It’s a “ninja trick” I have used in the past with a 100% success rate.

It’s simple: back up your decisions with results-oriented data.

I’m not talking about sharing vague “national averages” or anything like that. I mean sharing actual success metrics from other elearning courses that you have created.

This takes a bit of preparation on your part. Prior to any meetings with SMEs, you should go through the data from previous elearning engagements and pull out the “wins” that are the same as (or similar to) the ones you are currently working toward.

Have these memorized so that you can leverage them when necessary.

The next time you’re in a meeting and someone starts to tell you that your elearning design is flawed, you can say something along the lines of:

“Thanks for the feedback. You should know, though, that this was done quite intentionally. One of the goals for this course is to accomplish ‘ABC’. This is similar to another project I worked on, and when we created the course in this way, it not only met the desired goal of ‘ABC’, but also increased XYZ by 37%.”

Conversation over.

Of course, they might go on to tell you why this situation is different, but you can reaffirm that you have already done “X” months of analysis of the stakeholder group and that the similarities are undeniable. 🙂

While this little tip is nearly foolproof, it’s important that you still remain open to constructive criticism. Also, nobody likes a “know-it-all”, so use this technique sparingly and only when absolutely necessary. You’ll find it is a great way to keep your more difficult meetings on point.

Justin Ferriman

Justin started LearnDash, the WordPress LMS trusted by Fortune 500 companies, major universities, training organizations, and entrepreneurs worldwide. He is currently founder & CEO of GapScout. Justin’s Homepage | GapScout | Twitter