Return on investment, applied to learning, can be a tricky concept. Theoretically, we know how to do it, but all too often, we just…don’t. Which is really too bad, because how can you know whether the learning deliverables you’re creating are effective if there’s no system in place for measurement?

One common means of measurement in the learning and development field is Kirkpatrick’s Four Levels of Training Evaluation. Each of the four levels corresponds to a progressively deeper degree of evaluation:

  • Level 1 – Reaction

  • Level 2 – Learning

  • Level 3 – Behavior

  • Level 4 – Results

The four levels of evaluation

All too often evaluation stops at Level 1, Reaction, with the ubiquitous “happy sheet.” Did you like the training? Check. Did you learn something useful? Check. Will you use what you learned on the job? Check. You can integrate Level 1 evaluation into your video learning fairly simply in the form, say, of a Facebook-style “like” button at the end of your video, which will provide you with a baseline indication of its impact. Generally positive reactions mean you’re headed in the right direction, and generally negative reactions probably mean you’ve missed the mark and will need to reevaluate. While Level 1 can provide you with useful insights, keep in mind that the impressions you glean are largely based on learners’ perceptions of your training video rather than a real measure of lessons learned or skills acquired.
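If your video platform or LMS lets you export those reactions, tallying them into a baseline score takes only a few lines. The sketch below is a minimal, hypothetical example in Python: the record format and the 70% threshold are assumptions for illustration, not features of any particular platform.

```python
# Minimal sketch of a Level 1 (Reaction) tally.
# Assumes a hypothetical export of per-viewer reactions, where each record
# looks like {"viewer_id": ..., "liked": bool}; adapt to your own data.

from typing import Iterable, Mapping


def reaction_summary(reactions: Iterable[Mapping]) -> dict:
    """Summarize like/dislike responses into a simple baseline score."""
    records = list(reactions)
    total = len(records)
    likes = sum(1 for r in records if r.get("liked"))
    like_rate = likes / total if total else 0.0
    return {
        "responses": total,
        "likes": likes,
        "like_rate": round(like_rate, 2),
        # Illustrative threshold: treat a 70%+ like rate as "headed in the
        # right direction"; tune this to your organization's own baseline.
        "on_track": like_rate >= 0.70,
    }


if __name__ == "__main__":
    sample = [
        {"viewer_id": 1, "liked": True},
        {"viewer_id": 2, "liked": True},
        {"viewer_id": 3, "liked": False},
        {"viewer_id": 4, "liked": True},
    ]
    print(reaction_summary(sample))
    # {'responses': 4, 'likes': 3, 'like_rate': 0.75, 'on_track': True}
```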

Level 2, Learning, is a measure of what knowledge or skills have been acquired as a direct result of viewing the video. Reverse engineering can be helpful at this level of evaluation. At Obsidian Learning, we’re big fans of establishing clear learning objectives from the get-go no matter what form the learning deliverable takes, and video is no exception. If you’ve performed a needs analysis, formulated your learning objectives, and effectively translated those learning objectives into your video, you have something against which to measure.

Let’s say, for example, that your aim is to teach learners about a new piece of software. Your learning objective might be something along the lines of, “By the end of this video, you will be able to use XYZ software to [perform a stated task].” Shortly after the training video (and presumably, the software) goes live, you can start measuring whether your employees have indeed started using the software as intended. If they have, your return-on-investment target has probably been met or exceeded. Video has the added advantage of being suitable for performance support, so if any of your employees haven’t quite mastered the new software, they can go back and view the bits that directly impact their work. Being able to use a learning asset repeatedly is another positive gain in terms of ROI.

Level 3, Behavior, takes a look at, you guessed it, actual behavior change. Let’s go back to our software example. You invested in a training video, your employees have watched it and learned how to work in the new software, but the expected behavior change has not occurred. The ROI of your video has taken a serious hit. Why? This is obviously not the hoped-for result, but it is a good opportunity to go back and examine your messaging. Did you explain why the software is necessary? Did you speak the language of your target audience? Did you let them know what’s in it for them? In learning and development, these are basic questions that should be addressed before a single frame has been shot; if the expected behavior change or skill acquisition has not materialized, it’s a good bet that these questions were neglected.

Level 4, Results, is the big-picture evaluation of your video. Once more taking up our software example, let’s say that the company’s stated objective was to have 50% of all employees correctly using the new software within three months of the training event. Again, this type of objective should be set before development of the training asset begins. If the organization exceeds that goal, the training video is an unqualified success, and you’re looking at positive ROI. If not, you’ll need to do some analysis to understand why the video did not meet its intended metric.

Some variations of Kirkpatrick’s Levels of Evaluation include a fifth level, ROI Determination. This is a mathematical calculation that looks at whether the benefit obtained from the training exceeds your overall costs (development, rollout, evaluation). This evaluation can be challenging to perform because it requires placing a dollar value on results that can be difficult to characterize in monetary terms. In our software training example, you can develop a method to translate the increased percentage of users into a calculated cost savings for the company. If that savings exceeds the amount spent to create your training video, your company is in positive ROI territory. But sometimes training gains are intangible; it’s much more difficult to monetize soft skills such as communication and problem solving, for example. That’s not to say you shouldn’t make the attempt. Some evaluation is better than none, and even a broad-strokes analysis will help you determine whether you’re making the kinds of gains that demonstrate your training videos are worth the investment.
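To make the arithmetic concrete, ROI at this level is commonly expressed as net benefit divided by cost, multiplied by 100. Below is a minimal sketch of that calculation in Python; the dollar figures are hypothetical placeholders for the software training example, not data from any real program.

```python
# Minimal sketch of a Level 5 (ROI) calculation.
# All figures are hypothetical placeholders for the software training example.


def training_roi(benefit: float, costs: float) -> float:
    """Return ROI as a percentage: (net benefit / costs) * 100."""
    if costs <= 0:
        raise ValueError("costs must be positive")
    return (benefit - costs) / costs * 100


if __name__ == "__main__":
    # Hypothetical numbers: estimated annual savings from increased correct
    # use of the software, versus what the video cost to develop, roll out,
    # and evaluate.
    estimated_savings = 60_000.0
    total_costs = 40_000.0  # development + rollout + evaluation

    roi_percent = training_roi(estimated_savings, total_costs)
    print(f"ROI: {roi_percent:.0f}%")  # ROI: 50%
```

A positive percentage means the training returned more than it cost; a result at or below zero is a signal to revisit either the video itself or the way you are estimating its benefits.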

The use of video in learning is not new. As we’ve demonstrated throughout this eBook, it can be used for a variety of purposes, from company messaging to technical skills training to performance support. Like all learning initiatives, however, the benefits must demonstrably be worth the costs. Taking the time to apply Kirkpatrick’s Four (Five) Levels of Evaluation will help you figure out if you deserve a pat on the back or if it’s back to the drawing board. Either way, examining the ROI of your corporate training videos will help ensure that your learners are getting the information and skills they need, and that the organization is seeing the benefits, tangible or not.
