3 hot resources for best practice multiple-choice quizzing

In my previous post, 14 reasons why your multiple-choice quiz sucks, I listed typical clangers whose only purpose is to render your assessments ineffective.

If they’re the bad and ugly aspects of MCQ design, what’s the good?

To answer that question I hit Google and a couple of academic databases, but mostly in vain.

It may be due to my poor research skills, but I found very little empirical evidence of best practice multiple-choice quizzing. Plenty of unsubstantiated opinion (of course), but not much science.


You see, Google wasn’t much help because “best practice” is frequently confused with “common practice” – but they’re not the same thing.

The peer-reviewed literature wasn’t much better. Alarmingly, many of the studies were inconclusive, adopted flawed experimental designs, and/or didn’t compare the quiz-takers’ subsequent on-the-job performance under the different treatments – which is the whole point!

However, through a combination of persistence, serendipity and social networking, I finally uncovered 3 resources that I consider worth recommending: a journal article, a website and a blog.

1. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment – In this article, Thomas Haladyna, Steven Downing & Michael Rodriguez validate a taxonomy of 31 multiple-choice item-writing guidelines by reviewing 27 textbooks on educational testing and 27 research studies. If you want insight into the myriad MCQ variables, here it is.

2. QuestionMark – David Glow and Jayme Frey independently pointed me to the wealth of resources on this website. QuestionMark is a business, granted, but they know what they’re talking about – a claim backed up by the fact that they have a psychometrician on the payroll (cheers David), and by Eric Shepherd’s presentation at LearnX last year, which I heard with my own ears and which left me very impressed.

3. The eLearning Coach – Connie Malamed is a qualified and experienced e-learning designer whose blog provides advice to fellow practitioners. I value Connie’s expertise because it’s practical and proven in the real world.

If you are aware of other good MCQ resources – preferably evidence based – please share them here…
 

2 thoughts on “3 hot resources for best practice multiple-choice quizzing”

  1. I did a quick search and came up with the following; I’m thinking the first one is the key :)

    Use RadioButtons. Big ones.

    A Q&A that may be out of date:
    http://www.uwo.ca/tsc/re_multiple_choice.html

    A statement that just seemed to make sense, though it’s something we tend to do for self-assessment only:
    “Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing”

  2. Thanks, Gary.

    Using big radio buttons is a great idea. It’s frustrating when you try to click the right answer but it won’t take!

    Do I also infer that you prefer radio buttons over checkboxes? If so, I agree. Those questions with multiple correct answers are nothing but trouble.
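
    If you’re building your own quiz pages, here’s a minimal sketch of the big-target idea, assuming plain browser DOM APIs and TypeScript – the question name and option labels are made up for illustration, not taken from any of the resources above:

    ```typescript
    // A single-select question rendered with radio buttons.
    // Wrapping each input in a <label> makes the whole row clickable,
    // so generous padding gives the "big radio button" effect.
    const options = ["Option A", "Option B", "Option C"]; // illustrative only

    const form = document.createElement("form");
    options.forEach((text, i) => {
      const label = document.createElement("label");
      label.style.display = "block";  // full-width click target
      label.style.padding = "0.75em"; // generous hit area

      const input = document.createElement("input");
      input.type = "radio";  // radio = exactly one answer allowed
      input.name = "q1";     // a shared name groups the options
      input.value = String(i);

      label.appendChild(input);
      label.appendChild(document.createTextNode(" " + text));
      form.appendChild(label);
    });

    document.body.appendChild(form);
    ```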

    Cheers for the Q&A. That recommendation for only 3 answer options (1 correct + 2 distracters) is very interesting. I did notice a couple of papers about that specific variable, but unfortunately I haven’t yet had time to devote to them. When I finally get around to it, I’ll share my learning here.

    I also agree with the sentiment about feedback, though I wonder if a point of difference exists between formative and summative assessment. I’m keen to look into this in more detail, but at the moment my general view is that we need to inform the learner of what they got wrong so they don’t take the knowledge gap into the workplace. However, another part of me thinks that if the learner relies on feedback in the summative quiz, it’s probably too late and may point to a deficiency in the preceding instructional design. A conundrum… does anyone else have a view?
