When you’re nearing the end of development for an eLearning course, it’s easy to want to rush through the final review and quality assurance (QA) cycles in an effort to get it published ASAP. Trust me, I’ve been there more times than I can count! However, if I’ve learned anything during my time developing eLearning content, it’s that rushing usually results in sloppy work. This is where user acceptance testing for eLearning can help.
Although I’ve previously talked about the importance of conducting a thorough QA testing process, it doesn’t always catch every issue. Often, it’s your end-users (the learners) who spot the mistakes you missed. What you don’t want is your learners finding all of these errors after you’ve published your course!
Here are five tips for how to conduct user acceptance testing for eLearning.
Gather a Diverse Mix of Testers
The first step in conducting user acceptance testing for eLearning is to gather a diverse mix of testers. Although you might have a particular target audience for your course, usually geared towards a specific job function, there should be some diversity within that group. Specifically, aim for a mix of ages, sexes, and backgrounds. Ideally, you want to create a test group that matches the diversity within your target audience.
Having a diverse test group matters because different people experience and use digital content differently. There’s a lot to be learned from watching how different people interact with your course.
Don't Explain How to Use the Course
Once you’ve gathered your group of testers, the second step in conducting user acceptance testing for eLearning is not to explain how to use the course they’re testing. While you might provide some context about what your testers are doing, you don’t want to give any more details than the average learner who will be taking your course once it’s published.
Let your testers interact and explore the course on their own. Through this process, you’ll quickly identify what areas of the course aren’t intuitive and what might need additional work.
Observe & Take Notes
Once your testers have started testing the course, the third step in conducting user acceptance testing for eLearning is to simply observe and take notes. Your job is to watch how your testers navigate the course and, more specifically, where they get stuck. Take note of where your testers struggle so that you can refine those areas of the course before publishing.
In those situations where your testers get stuck (and some will), resist your instinct to help them. The goal is to learn how you can improve the design of your course; in real life, you won’t be able to help all of your learners navigate it.
Debrief & Ask Questions
Once your testers have successfully (or unsuccessfully) completed the course, the fourth step in conducting user acceptance testing for eLearning is to debrief and ask questions. It’s at this point that you get to hear what your testers thought of the course. If any of your testers got stuck, why? Were they able to get unstuck? If so, how? The answers to these questions can help you uncover simple fixes to improve the user interface design of your eLearning course.
This is also an excellent time to see how your testers responded to the learning content. What did they know about the topic before taking the course, and what did they learn after completing it?
Don't Justify Your Design Decisions
The final step in conducting user acceptance testing for eLearning is not to justify your design decisions. This is especially important when you’re receiving feedback from your testers. While your impulse might be to explain what the testers did wrong or what your intentions were, don’t do it. Just keep your mouth shut and listen.
The goal is to learn from your testers to improve the design of your course, not to correct what they did or did not do wrong during the testing process.
What other tips can you share for conducting user acceptance testing for eLearning? Share them by commenting below!