
Multiple-choice exams are a part of our assessment landscape, for better and for worse. Critics point to the format's shortcomings when it comes to formative assessment and question its efficacy in actually promoting learning. An Edutopia article on "the dark history of multiple-choice exams" states that "multiple-choice tests are not catalysts for learning" and that "they incite the bad habit of teaching to tests."

Criticism of multiple-choice exams includes the following:

  • Multiple-choice exams do not accurately assess learning. If students can narrow the choices down to two, they have a 50% chance of picking the correct answer purely by guessing, without any conceptual understanding.
  • Multiple-choice exams rely on recall rather than higher-order thinking. Students often don't have to demonstrate conceptual understanding with their responses.
  • Multiple-choice exams are largely summative as an assessment tool--students rarely receive feedback on such tests.

When students can choose correct answers on a multiple-choice exam without understanding the material, how can you accurately assess student learning gaps?

On the other hand, supporters of the multiple-choice format laud its efficiency, especially for large lecture courses. Many say multiple-choice exams are the easiest assessment type to grade, especially with machine-grading support, and that grading them involves little subjective bias. Add to this student preference for multiple-choice exams (students find the questions easier to navigate through the process of elimination)--and you've got yourself a very popular type of assessment.

So--given the prevalence and likely permanence of multiple-choice exams, how can one mitigate the exam format’s limitations?

Here are three valuable ways instructors can enhance their multiple-choice exam questions:

1. Mitigate the ways in which students can eliminate choices without full conceptual understanding.

Double-check grammatical consistency. For example, make sure the verb tense of every choice matches the verb tense of your question. Don't ask for a noun answer and then offer a verb as an option, and don't ask for a plural answer and then include a singular option among the responses.

Sample Question:
In the following reaction,
4NH3 + 5O2 --> 4NO + 6H2O
the element being oxidized and the oxidizing agent are:

Before (non-matching answer: option d):
(a) N and NH3
(b) N and O2
(c) O and NH3
(d) O3

After:
(a) N and NH3
(b) N and O2
(c) O and NH3
(d) O and O2


Consider giving your multiple-choice test to someone not taking your course--and see how many questions they get correct. Ask whether anything tipped them off to the right answer, such as the verb-tense mismatches mentioned above (Weimer, Faculty Focus, 2018).


Write very plausible responses (aka "distractors"). Good distractors can benefit learning because students must draw on deeper understanding in order to arrive at their answers.

Sample Question:
Which of the following artists painted the ceiling of the Sistine Chapel?

Before (poor distractors: options c and d):
(a) Pollock
(b) Michelangelo
(c) Dr. Seuss
(d) Disney

After:
(a) Raphael
(b) Michelangelo
(c) Da Vinci
(d) Donatello


Avoid using absolute terms like "all," "never," and "always" in distractors--test-wise students read them as indicators of incorrect answers. Instead, use more nuanced phrasing that encourages students to apply their knowledge, such as "most likely" or "which is the best."

Sample Question:
In The Lord of the Rings, what is Frodo’s mission with the ring?

Before (absolute terms in options a, b, and c):
(a) To never listen to the ring.
(b) To always wear the ring around his neck.
(c) To let no one know he has the ring.
(d) To destroy the ring in the fires of Mount Doom.

After:
(a) To avoid listening to the ring.
(b) To wear the ring around his neck.
(c) To let only a few people know he has the ring.
(d) To destroy the ring in the fires of Mount Doom.


Make each response similar in length and detail. (Xu, Kauer, & Tupy, American Psychological Association, 2016).


FYI--according to research, the correct answer is most often placed in position C or D. So definitely randomize your correct answer locations (NBME, 2001).
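If you assemble exams programmatically, randomizing answer positions is a few lines of Python. This is a minimal sketch for illustration--the function name and data format are assumptions, not part of any exam tool:

```python
import random

def shuffle_options(options, correct_index):
    """Return the answer choices in random order, plus the new
    position of the correct answer.

    `options` is the list of answer choices and `correct_index` is the
    position of the correct one in that list (hypothetical inputs for
    illustration).
    """
    order = list(range(len(options)))
    random.shuffle(order)                      # random permutation of positions
    shuffled = [options[i] for i in order]
    new_correct = order.index(correct_index)   # where the key ended up
    return shuffled, new_correct

# Example: shuffle the Sistine Chapel question's choices.
choices = ["Raphael", "Michelangelo", "Da Vinci", "Donatello"]
opts, key = shuffle_options(choices, correct_index=1)
```

Shuffling per exam copy (or per question bank draw) keeps the key's position from settling into a predictable slot.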


2. Test for deep conceptual knowledge

Avoid true/false questions--they reward recall and guessing rather than deeper understanding.


Design tests that are challenging but not overly difficult, so that scores accurately reflect student learning (Butler, 2018).


Do an item analysis--check whether a question is being missed even by students with high overall exam scores. A question that strong students miss is often a flawed question rather than a gap in learning.
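One common way to run this check is an upper-lower discrimination index: compare how often your strongest and weakest students answer each item correctly. The sketch below is a minimal illustration, assuming a simple hypothetical data format (a list of per-student dicts of 0/1 item scores); it is not a standard library function:

```python
def discrimination_index(responses, item, frac=0.27):
    """Upper-lower discrimination index for one exam item.

    Students are ranked by total score; the top and bottom `frac`
    (27% is a conventional cutoff) are compared on this item. The index
    is the difference in proportion correct between the two groups.
    Values near or below zero flag items missed even by high scorers.
    """
    ranked = sorted(responses, key=lambda r: sum(r.values()), reverse=True)
    k = max(1, int(len(ranked) * frac))   # group size (at least one student)
    upper, lower = ranked[:k], ranked[-k:]
    p_upper = sum(r[item] for r in upper) / k
    p_lower = sum(r[item] for r in lower) / k
    return p_upper - p_lower

# Hypothetical results for three items: high scorers miss q3
# while a low scorer gets it right, so q3 flags as problematic.
students = [
    {"q1": 1, "q2": 1, "q3": 0},
    {"q1": 1, "q2": 1, "q3": 0},
    {"q1": 1, "q2": 0, "q3": 1},
    {"q1": 0, "q2": 0, "q3": 1},
]
```

An item with a negative index, like q3 here, is worth reviewing: its wording or key may be the problem, not your students.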


3. Benefit student learning

Avoid using "None-of-the-Above" and "All-of-the-Above" as response options. None-of-the-Above is especially harmful when it is the correct answer--students don't have to know the right answer at all; they just have to eliminate the incorrect responses (Butler, 2018).


Provide feedback after the assessment. Review the test with your students. If you've written good distractors that many students selected, this is an opportunity to address the underlying misconceptions, deepening student learning and improving your teaching.


We hope that this helps you in designing multiple-choice exams that pivot towards formative assessment.

Check out Gradescope