We often get requests to provide feedback on surveys. As we review them, we tend to see the same three challenges come up again and again. I’ve written this blog to offer suggestions for addressing each of them.
1. Double-Barreled Questions. In a double-barreled question, two concepts or ideas are included in one survey question. For example, the three survey items below are all double-barreled questions:
○ “How much of your monthly income is spent on restaurant dining and entertainment?”
○ “How often do you volunteer your time or make charitable donations?”
○ “How often do you consume beer and smoke cigars?”
But how does a respondent answer a double-barreled question? For example, if a family spent no money on restaurant dining but 10% of their income on entertainment, how should they respond? 0%? 10%? Average the two and respond 5%? Survey respondents usually do provide an answer, but each tends to invent an individualized strategy for choosing one, which makes the responses difficult for program staff to interpret.
Double-barreled questions can be addressed by limiting each survey item to one – and only one – concept. Below is a double-barreled question followed by revisions that eliminate the double-barrel:
Original (double-barreled): How much of your monthly income is spent on restaurant dining and entertainment?
Revision 1: How much of your monthly income is spent on restaurant dining?
Revision 2: How much of your monthly income is spent on:
- restaurant dining?
- other (non-food) entertainment?
2. Imbalanced Response Options. A set of response options is imbalanced when it does not cover the full range of possible answers. An imbalanced scale is problematic because it restricts respondents to one end of the scale and may produce inaccurate results. For example, if a respondent wants to answer “strongly disagree” to a question but that option is not available, the survey is not capturing the respondent’s perspective. Using a balanced set of response options ensures that respondents’ answers can be accurately captured. A very good resource for a variety of balanced response scales can be found here: http://www.gifted.uconn.edu/siegle/research/Instrument%20Reliability%20and%20Validity/Likert.html
Below is an example of an imbalanced response scale with a balanced improvement:
○ Imbalanced: Strongly Agree / Agree / Neutral / Disagree
○ Balanced: Strongly Agree / Agree / Neutral / Disagree / Strongly Disagree
3. Overlapping Response Options. Overlapping response options simply means that there is some overlap in the response choices available to the respondent. For example, the following survey item has overlapping response options:
How many books did you read in the last month?
○ 0 or 1 book
○ 1 or 2 books
○ 2 or 3 books
○ 3 or more books
Overlapping response options are problematic because they can confuse the survey respondent. If you read two books in the last month, which option would you select: “1 or 2 books” or “2 or 3 books”? In addition, questions with overlapping response options are a challenge to interpret: how do you know how many books were read if the response options overlap? Overlapping response options can be addressed very easily by carefully constructing the response option list. Simply revise the list so that the choices are mutually exclusive, for example: 0 books; 1 book; 2 books; 3 or more books.
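If your response options are numeric ranges, you can also verify them programmatically before the survey goes out. Below is a minimal sketch (my own illustration, not a tool mentioned above, assuming integer counts and inclusive ranges) that flags both overlaps and gaps between adjacent options:

```python
def bins_are_clean(bins):
    """Check that a sorted list of inclusive (low, high) ranges has
    no overlaps and no gaps between adjacent response options."""
    for (lo1, hi1), (lo2, hi2) in zip(bins, bins[1:]):
        if lo2 <= hi1:
            return False  # overlap: next option starts before this one ends
        if lo2 != hi1 + 1:
            return False  # gap: some integer counts have no option
    return True

# The overlapping options from the book question above:
overlapping = [(0, 1), (1, 2), (2, 3)]   # "0 or 1", "1 or 2", "2 or 3"
print(bins_are_clean(overlapping))       # False: adjacent options share endpoints

# A revised, mutually exclusive set ("3 or more" capped for the check):
revised = [(0, 0), (1, 1), (2, 2), (3, 100)]
print(bins_are_clean(revised))           # True
```

A check like this is most useful when a survey has many numeric scales (income brackets, age ranges, frequency counts) and hand-inspecting every list becomes error-prone.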
If you have any questions about the above challenges to designing a survey, please feel free to leave a comment or email me at Kirsten@acetinc.com. Please also contact any of us at ACET if you have questions about survey design in general.