Putting it All Together: The Benefits of Using Distractor Analysis

August 4, 2015 Aaron Dewald

This post and accompanying video are the final part of a series of four by Aaron Dewald, Associate Director, Center for Innovation in Legal Education, University of Utah College of Law. Watch his final video, and then read his article below on item distractor analysis.

 

Hey everyone and welcome to the final post in this introduction to item analysis series. Remember, this is simply a high-level view, and there’s much more to learn about reliability, validity, and even the cognitive levels at which we write our questions.

The Benefits of Distractor Analysis

One of the benefits of conducting this analysis is that it can help us understand how our alternatives are performing. When we run an item analysis report, we’ll see a section that tells us the percentage of test takers who chose each individual alternative. This is very, very valuable information. Let’s say that we have five possible alternatives for our question. One is the answer; the other four are distractors. When we look at the report, let’s pretend most of the students are choosing either the right answer or two of the remaining four options. Hmm. It seems that two of the distractors are working well, but the other two are not. What could be wrong?
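If you want to see how this kind of response distribution comes together, here is a minimal sketch in Python. The item, the option labels A through E, and the response data are all invented for illustration; they are not from any particular item analysis tool.

```python
from collections import Counter

def option_distribution(responses, options=("A", "B", "C", "D", "E")):
    """Return the percentage of test takers who chose each alternative."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: 100 * counts.get(opt, 0) / total for opt in options}

# Hypothetical responses to one item from 20 test takers; "C" is the keyed answer.
responses = list("CCACCBCCACCCBCACCCCB")
print(option_distribution(responses))
# {'A': 15.0, 'B': 15.0, 'C': 70.0, 'D': 0.0, 'E': 0.0}
```

In this made-up example, alternatives A and B are pulling some responses while D and E attract no one at all, which is exactly the pattern described above: two distractors are doing their job and two are not.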

To help us out, we’re going to turn to a bit of research done by Drs. Haladyna and Downing. They have done a tremendous amount of research analyzing guidelines for writing effective multiple-choice questions. If you’re curious, I suggest reading “Validity of a Taxonomy of Multiple-Choice Item-Writing Rules” (1989).

Guideline: When it comes to alternatives, quality over quantity

How many answer options should I provide? Two? Five? This has a huge impact on how we write our exams. More alternatives mean more writing time for teachers and more time for students to consider each alternative.

Turns out the answer is, “As many plausible alternatives as you can write, but three is often sufficient.” What does this mean? It means no “joke” or obviously wrong answers. It means each answer option should be carefully considered and written. If three alternatives (the correct answer and two distractors) are all you can come up with, that’s fine. The research mentioned above found that item discrimination was not greatly affected by the number of answer options available to students, as long as there were three or more. The key really is quality over quantity.

Guideline: Look at the responses of both the top and bottom 27%

The next thing we’ll talk about lets us lean a little bit on item discrimination. When we’re looking at our report, it’s worth looking at the response distributions of the top and bottom 27% separately. Breaking out the responses of these groups gives us some insight into how those students are choosing among the alternatives. If there’s a question with a particularly high index of discrimination, that’s a good place to start. How are the bottom 27% of students answering? Are they fixating on one incorrect alternative? That might mean there’s a misconception among our bottom students. Maybe the question is too difficult, or they’re simply guessing. We can look at which distractor they are choosing and then think about why these students might believe it is correct. These results can advise you on subtle things to adjust as you teach the topic in subsequent years.
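As a rough sketch of what this breakdown might look like in code, here is one way to split students into top and bottom 27% groups by total score and tally how each group answered a single item. The student record format and the item label "Q7" are hypothetical, chosen only for the example.

```python
def group_distributions(students, item, frac=0.27):
    """
    Split students into top and bottom groups by total exam score
    (27% each) and tally how each group answered one item.

    `students` is a list of dicts like
    {"total": 42, "answers": {"Q7": "B", ...}}  # hypothetical record format
    """
    ranked = sorted(students, key=lambda s: s["total"], reverse=True)
    n = max(1, round(len(ranked) * frac))
    top, bottom = ranked[:n], ranked[-n:]

    def tally(group):
        counts = {}
        for s in group:
            choice = s["answers"].get(item)
            counts[choice] = counts.get(choice, 0) + 1
        return counts

    return {"top 27%": tally(top), "bottom 27%": tally(bottom)}

# Example: group_distributions(students, "Q7") might return
# {"top 27%": {"C": 9, "A": 1}, "bottom 27%": {"A": 6, "C": 2, "B": 2}}
```

If the bottom group piles onto one particular distractor, as in the illustrative output above, that is the signal worth investigating: it may point to a shared misconception rather than random guessing.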

Did you enjoy this series on item analysis? Subscribe above to have everything delivered straight to your inbox, or register here for an upcoming Twitter Chat hosted by Aaron himself!

 

About the Author

Aaron Dewald

Aaron is currently a Ph.D. candidate in learning science, which gives him a unique perspective on technology use in pedagogical situations. Aaron received his B.S. in Information Systems from North Dakota State University in 2001, and his M.Ed. in Instructional Design and Educational Technology from the University of Utah in 2010.
