October 21, 2017
What Do You Do With Your Evaluation Data?
I've been an eLearning designer and developer since 2005. In 2015 I started my own eLearning design company. I began creating Adobe Captivate video tutorials to help promote my business through my YouTube channel at https://youtube.com/captivateteacher. My intention with my YouTube videos was to attract attention from organizations looking for a skilled Captivate developer. This strategy proved successful, and I've worked with clients worldwide, helping them build highly engaging eLearning solutions. My YouTube channel had another benefit as well: it attracted aspiring Captivate developers who sought me out as a teacher. I now offer online and onsite training on Adobe Captivate, teaching users the skills to build engaging and interactive learning.

Donald Kirkpatrick created the four-level model for training evaluation, which most organisations claim to cherish. For those unfamiliar, the four levels are as follows:

  1. Reaction – this answers the question: what did the learners think of the training? We measure reaction through surveys conducted towards the end of training (sometimes called smile sheets).
  2. Learning – this answers the question: what did the learners learn during or immediately following the instruction? We measure learning most often through a quiz or a skills demonstration.
  3. Behaviour – this answers the question: did the learners implement their new knowledge or skills back on the job? We typically measure behaviour through follow-up assessments, often from the learner's manager.
  4. Results – this answers the question: what impact did the training have on the organisation? We measure results most often with financial reports, although results can also include measures like customer satisfaction.

In my last full-time job (before I became a freelance designer/developer), the facilitator or designer/developer would review their level 1 evaluations and retain the data for their semi-annual review. Occasionally the team manager would look at them, but more often than not, the team administrator would stuff them in a file cabinet, never to be seen again.

As a designer, I would look at the occasional report from our level 2 evaluations. Unfortunately, our LMS wasn’t sophisticated enough to tell me which questions were proving difficult for my students. Had I had that data, I would first have looked closely at the course content behind those problem questions, and second, I would have reviewed the questions themselves, asking whether they were written in a way that made them difficult to answer correctly.
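If your LMS can at least export raw quiz responses, this kind of item analysis is easy enough to do yourself. Here is a minimal sketch in Python; the file name quiz_results.csv and the question_id and correct column names are assumptions about the export format, not fields any particular LMS guarantees.

```python
import csv
from collections import defaultdict

def question_difficulty(csv_path):
    """Return (question_id, incorrect_rate) pairs, hardest question first."""
    counts = defaultdict(lambda: [0, 0])  # question_id -> [incorrect, total]
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            tally = counts[row["question_id"]]
            tally[1] += 1
            if row["correct"].strip().lower() not in ("1", "true", "yes"):
                tally[0] += 1
    return sorted(
        ((q, wrong / total) for q, (wrong, total) in counts.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Flag any question that more than 40% of learners answered incorrectly.
for qid, rate in question_difficulty("quiz_results.csv"):
    if rate > 0.40:
        print(f"Review question {qid}: {rate:.0%} answered incorrectly")
```

The 40% threshold is arbitrary; pick whatever cut-off makes a question worth a second look in your context.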

I’m afraid to say that in my previous organisation we didn’t perform any level 3 or level 4 evaluations at all. There was simply no demand for this information and very little time to conduct the research needed to produce it. Instead, our executives were more concerned with completion reports.

When I started working alongside Adobe, they granted me a complimentary license for Adobe Captivate Prime for a period. I was impressed with the simple yet effective level 3 evaluation tools built into the LMS. Each time an employee completes online training in Adobe Captivate Prime, the employee’s manager receives a notification some time later asking them to evaluate the employee’s on-the-job performance. Level 1 and 2 evaluations are great, but what matter most are behaviour and results. If you combine the level 3 results provided by this LMS with your company’s financial reports, you can say without too much uncertainty whether your company’s learning strategy is effective.
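To make that concrete, here is a minimal sketch with made-up numbers: it correlates hypothetical per-team level 3 manager ratings with a business metric you already track, such as customer satisfaction. Correlation isn't proof of causation, but a consistently strong relationship over time is the kind of evidence a completion report can never give you.

```python
import statistics

# Hypothetical per-team figures: average level 3 manager rating (1-5 scale)
# and a business result already tracked (here, customer satisfaction %).
level3_ratings = [3.1, 3.8, 4.2, 2.9, 4.5, 3.6]
csat_scores = [71, 78, 85, 68, 88, 75]

# Pearson's r (statistics.correlation requires Python 3.10+).
r = statistics.correlation(level3_ratings, csat_scores)
print(f"Correlation between level 3 ratings and CSAT: r = {r:.2f}")
```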

Shortly after I trialled Adobe Captivate Prime, I created the following video. It’s a couple of years old now, but I think it’s still an accurate assessment of Adobe’s LMS product and of how effective your learning can be.

5 Comments
2019-09-05 19:46:20

Thanks for providing some insight into this valuable feature of Captivate Prime.

2019-09-05 21:17:34
In reply to ToyinJ.A.'s comment:

You’re welcome.

2019-04-26 13:03:11

I do it through a reflection at the end of the course that asks the right questions. I make sure I read the evaluations too and respond accordingly. This takes time, but if it is not done, what is the point of having a reflection, except to give the impression that so-called ‘quality assurance’ and ‘best practice’ are being performed?

Levels 3 & 4 require delayed reflection. I encourage participants to add what they intend to implement to their PDP and then check later e.g. six months (or whatever is appropriate).

2018-05-20 04:50:48

That is a great idea, keeping the data for a semi-annual or annual review.

2018-03-19 14:27:32

Yes, if automatic level 3 feedback can be encouraged and expected, wow, can that be a great asset to trainers and developers. Thank you for sharing your thoughts, Paul.
