Multiple choice question creation is challenging and time-consuming. While question banks from the publisher can certainly benefit students by providing opportunities for practice, I prefer to create my own exam questions to make sure they align with my learning objectives and the material I emphasized in class. While one of the main benefits of multiple choice questions is the ease and speed of grading, one of the drawbacks is the amount of time invested in developing each question. Various sources report that professional test item writers spend 30 minutes to 1 hour on just the first draft of a multiple choice question (1, 2). In some ways these numbers are reassuring, as I know I am not alone in my occasional struggle with creating plausible distractors (the incorrect answer options) and targeting higher-level thinking and reasoning.
During last week’s Faculty Technology Institute, TLT offered a session on best practices to improve multiple choice questions and exams. A number of tips resonated with the participants and led to a vibrant discussion. With multiple choice questions being so common, I thought the rest of the college community might gain new ideas from some of these discussion points. The summer is the perfect time to look back on exams from the past year and evaluate their effectiveness at measuring student learning. You might consider carrying out an item analysis on some of your questions to evaluate their difficulty and discrimination (3). Here are a few practical things to consider if you plan on revising your multiple choice questions:
3 options are optimal (in most cases).
A meta-analysis of over 80 years of research concluded that 3 options, or choices, are optimal for multiple choice questions (4). The analysis examined item difficulty, discrimination, and reliability and concluded that 3 options are best in most settings. I found this paper fascinating, and I was pleased to find out that I can spend less time trying to come up with plausible distractors for each question, while at the same time reducing the reading burden for students. Wahoo!
Question order does not influence performance or completion time.
The majority of research on this topic indicates that question order has no effect on performance or completion time (5). This is great news as scrambling question order is one strategy adopted by many instructors to prevent cheating. Interestingly, students may perceive exams with randomly ordered questions as more difficult than chronologically ordered exam questions (5). This might be something to keep in mind if you often hear from students that your exams are really difficult.
Following all the item writing recommendations is really hard.
- Avoid absolutes (always, never, all, none, all of the above, none of the above, etc.).
- Avoid negatives (all of the following except, which of the following is not true, etc.).
- Avoid imprecise terms (usually, sometimes, rarely, etc.).
- Keep the stem of the question succinct.
- Keep distractor length consistent.
I know my past exams have included “none of the above” or “all of the above” as options. Faculty attending TLT’s sessions on writing multiple choice questions have commented that these recommendations can be hard, sometimes impossible, to follow. I am hoping to reduce my item flaws by cutting question options down to three.
Poorly constructed questions and exams negatively affect students, and they interfere with interpretations of the exam results. As an instructor, I want to make sure that my questions are reliable and valid. In addition to wanting my exams to align with my learning objectives, I want my exams to be a reflection of student learning in my course and not a measure of reading ability or test-taking savviness. The recommendations listed above have led me to rethink my exam format and reconsider some of my test questions. I hope they are useful to you too!
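If you want to try the item analysis mentioned earlier, the two classical statistics are easy to compute yourself. The sketch below is my own illustration, not from any of the cited sources: item difficulty is the proportion of students who answered correctly, and a simple discrimination index compares how the top- and bottom-scoring groups (classically the top and bottom ~27%) did on that item. The function names and the student data are made up for the example.

```python
# Hypothetical sketch of a classical item analysis.
# Difficulty = proportion correct (higher = easier item).
# Discrimination = difficulty among top scorers minus difficulty
# among bottom scorers; values near 0 (or negative) flag items
# worth revising.

def item_difficulty(responses):
    """Proportion of correct (1) responses to one question."""
    return sum(responses) / len(responses)

def discrimination_index(responses, total_scores, fraction=0.27):
    """Upper-lower discrimination index using the classic ~27% groups."""
    n = max(1, round(len(responses) * fraction))
    # Rank students by their total exam score.
    ranked = sorted(zip(total_scores, responses), key=lambda pair: pair[0])
    lower = [resp for _, resp in ranked[:n]]   # bottom-scoring group
    upper = [resp for _, resp in ranked[-n:]]  # top-scoring group
    return item_difficulty(upper) - item_difficulty(lower)

# Made-up data: 1 = correct, 0 = incorrect on one question,
# alongside each student's total exam score.
item = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
totals = [88, 52, 90, 75, 48, 95, 60, 82, 70, 55]
print(item_difficulty(item))                # 0.6
print(discrimination_index(item, totals))   # 1.0
```

In this toy example the item is moderately difficult (60% correct) and discriminates perfectly, since every top scorer got it right and every bottom scorer got it wrong; real items usually land somewhere in between.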
(1) Van Hoozer, H.L. (1987). The teaching process: theory and practice in nursing. Norwalk, Connecticut: Appleton-Century-Crofts.
(4) Rodriguez, M.C. (2005). Three options are optimal for multiple-choice items: a meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24, 3-13.
(5) Pettijohn, T.F. and Sacco, M.F. (2007). Multiple-choice exam question order influences on student performance, completion time, and perceptions. Journal of Instructional Psychology, 34, 142-149.