Researchers sometimes ask sensitive questions in surveys. Respondents are often hesitant to answer sensitive items, so item nonresponse on these questions is normally higher than for other questions in a survey. Some respondents may even stop taking the survey because a sensitive question turns them off from the process. This tipsheet deals with two types of sensitivity: social desirability and privacy.
A common source of bias in surveys is social desirability bias. This refers to the tendency of respondents to overreport socially favorable attitudes and behaviors on sensitive questions. For example, it is socially undesirable to hold racist or sexist attitudes. Many respondents who hold these attitudes realize that their opinions are considered “bad” by social norms, so they are hesitant to admit them even in the relatively anonymous setting of a survey. Such respondents will report more egalitarian beliefs than they really hold.
The same logic applies to behaviors. Wearing a seatbelt while driving is a “good” behavior, but many people do not do it. Respondents may tend to overreport seatbelt usage, causing the survey to overestimate rates of seatbelt wearing.
Sometimes social desirability is not a problem with a question, but respondents may still be uncomfortable answering it because they are concerned with privacy. This is especially the case with questions about politics, religion, and demographics. Many people cling to the belief that politics and religion are not spoken about in polite company, so they may be uncomfortable even in the survey setting disclosing for whom they voted or what church they attend. Even basic demographic questions such as race, sex, age, and income are often met with a “that’s none of your business” attitude from respondents.
Sensitivity is difficult to combat, but there are some techniques survey researchers can use to reduce bias that results from it:
Anonymity and Confidentiality
Always reassure respondents about their anonymity or confidentiality in the introduction to the survey. Remind them of these assurances later in the survey when introducing sensitive questions. Researchers may even want to state explicitly that no one (outside of the research team) will ever be able to match respondents’ identities to their answers. For demographic questions, it sometimes helps to say that these questions are asked for analysis purposes only. The more researchers reassure respondents of their privacy, the more at ease respondents are likely to feel, so repeat these reassurances as often as needed.
Avoid putting sensitive questions too early or too late in the survey.
It is generally not a good idea to start the survey with any question that touches on something private. When respondents start a survey, they are usually not yet drawn into the process or committed to finishing it. Some respondents start a survey to see if the first few questions are interesting, then decide whether it is worth finishing. Putting a sensitive question up front immediately raises a red flag with respondents who have privacy concerns and increases the likelihood that they will break off the survey. It is better to lead the questionnaire with simple items that draw respondents into the survey process and engage their interest. If there are no other viable alternatives, it is acceptable to start the survey with simple demographics, but this approach is not ideal. Never put a question with social desirability concerns first.
Placing sensitive items at the end of the survey is not a great idea either. Dealing with sensitive questions can be unpleasant for many respondents, even if they choose not to answer them. Researchers should not risk ending the survey with respondents feeling suspicious or offended, especially if the research plan involves recontacting respondents in the future. Generally, it is acceptable to end the survey with demographics as these are usually the least important items and raise the lowest level of privacy concerns, but never end a survey with questions that raise social desirability issues or that ask about actions or attitudes that might be considered private.
Sensitive questions should be placed around the middle of the survey, usually halfway to two-thirds of the way through the questionnaire. Putting them here lets the researcher draw the respondent into the response process and build a base level of trust before sensitive topics are raised. This strategy also leaves room for less sensitive questions to be posed closer to the end of the questionnaire. Again, if you must start or end the survey with demographics, that approach is acceptable. However, always put the most sensitive items closer to the middle.
Make respondents feel comfortable telling the truth.
Do you support or oppose drawing school attendance zones to make schools more racially diverse?
The question above is a simple one, but it raises concerns with social desirability. Respondents may feel pressured to answer that they support this type of zoning plan. If respondents oppose the idea because they hold racist beliefs or are uncomfortable with racial diversity, they may feel that giving an “oppose” answer reveals their true attitude to the interviewer and makes them look racist. Thus, they answer “support” instead to avoid revealing a socially undesirable attitude.
Some respondents may oppose diversity-based districting plans for reasons that have nothing to do with race. Perhaps they know that these plans often involve bussing students far across a city in order to diversify schools, and would prefer that students (including their own, maybe) attend a neighborhood school that is closer to home. These respondents may also feel pressured to answer “support” because they fear that an “oppose” answer makes them look racist, even though their attitude has nothing to do with race per se.
The researcher wants to make respondents feel comfortable enough to reveal their true attitudes, so the question can be reworded or introduced in a way that makes an “oppose” answer look more acceptable. This might be one alternative wording:
Race is a topic that makes many people uncomfortable, but the government often makes policies that deal with racial issues. We’d like to know how you feel about some of these policies. There is no right or wrong answer, and you can choose not to answer the question for any reason. Do you support or oppose drawing school attendance zones to make schools more racially diverse?
The same approach also works with behaviors. The first question below is a straightforward item about whether the respondent voted in the last election. Many people lie about voting, however. “Good citizens” are supposed to vote, so respondents are often wary of admitting that they skipped an election. This is one reason why political surveys tend to significantly overestimate voting rates. The second question below frames the voting question in a way that signals to the respondent that not voting is an acceptable response.
Did you vote in the November election?
In talking to people about elections, we find that a lot of people were not able to vote because they weren’t registered, they were sick, or they just didn’t have the time. How about you? Did you vote in the election this November?
Consider the survey mode.
The mode in which a survey is deployed significantly affects how sensitivity shapes the results. Respondents are more hesitant both to answer sensitive questions in the first place and to answer them truthfully in modes where a human interviewer is present. Thus, social desirability bias and privacy concerns are bigger issues in face-to-face and phone surveys. Respondents feel less anonymous in these formats and prefer to project a positive image to the interviewer, so they are less willing to disclose sensitive information.
Surveys administered without a human interviewer, however, reduce sensitivity effects substantially. Respondents are more willing to disclose private and socially undesirable information about themselves in web and mail surveys, where there is no pressure to maintain a positive image in front of an actual human being. Self-administered modes do not eliminate social desirability and privacy concerns, but researchers who are especially interested in asking about sensitive topics should give serious consideration to the merits of using them (or even a mixed-mode design). Self-administered surveys overall tend to yield lower data quality and lower response rates, but these tradeoffs may be worth accepting in exchange for greater disclosure on sensitive items.
Author: Patrick R. Miller, DISM Survey Research Associate