Response rates to surveys have been dropping for many years. This fact is not new to those involved with tracking studies, usage and attitude studies, and other types of marketing research.
A recent study from the Pew Research Center for the People & the Press, called “Assessing the Representativeness of Public Opinion Surveys”, points out the magnitude of the decline.
According to the study, the response rate to telephone surveys (defined as the percentage of households in a sample that are actually interviewed) fell from 36% in 1997 to 9% in 2012. Some of the main reasons response rates have fallen so dramatically include:
- The rise of Caller ID, which makes it easier to screen out unwanted calls.
- The prevalence of cell phones, which complicate surveys in a variety of ways. For example, unlike landlines, cell phones are often single-user phones; if the user of the phone is not there, or is not the right respondent, it is more difficult to ask to speak with someone else.
- The rise of non-traditional schedules in today’s family, making it more difficult to reach family members at home.
Does this mean that you should cancel your upcoming surveys over concerns about data quality? Far from it.
The news over recent years isn’t all bad, and in fact is quite good in two regards.
First, there is little evidence that the decline in telephone survey response rates has reduced the representativeness of surveys or increased response bias. According to the Pew study results, surveys that are properly conducted provide accurate data on most key measures that matter to managers. In the words of the report, “…despite declining response rates, telephone surveys that include landlines and cell phones and are weighted to match the demographic composition of the population continue to provide accurate data on most political, social and economic measures.”
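The weighting the Pew report refers to can be illustrated with a simple post-stratification sketch: each respondent receives a weight equal to their demographic cell's population share divided by its sample share, so that the weighted sample matches the population composition. The age groups and target shares below are made up for illustration, not taken from the Pew study.

```python
# Minimal post-stratification sketch. The cells and target shares
# are hypothetical; real studies weight on several demographics at once.
from collections import Counter

def poststratify(respondents, population_shares):
    """respondents: list of demographic cell labels, one per respondent.
    population_shares: dict mapping cell label -> population proportion.
    Returns one weight per respondent so that the weighted sample
    matches the population composition."""
    n = len(respondents)
    sample_shares = {cell: count / n
                     for cell, count in Counter(respondents).items()}
    return [population_shares[cell] / sample_shares[cell]
            for cell in respondents]

# Young adults are underrepresented in this sample (20% vs. 35% in the
# population), so each of them gets a weight above 1.
sample = ["18-34"] * 2 + ["35-54"] * 4 + ["55+"] * 4
targets = {"18-34": 0.35, "35-54": 0.35, "55+": 0.30}
weights = poststratify(sample, targets)
```

After weighting, the two respondents aged 18–34 each count for 1.75 interviews, which restores their 35% population share in any weighted estimate.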
The Pew Research Center study compares two national telephone surveys. One was conducted with a typical survey methodology and yielded a 9% response rate; the other used additional effort over a longer field period to achieve a 22% response rate. The two studies yielded similar results, which is good news for surveys.
Second, the drop in response rates is not all bad news because it is concentrated in telephone surveys. The decline in telephone response rates over the past 15 or so years has been paralleled by the rise of online surveys, primarily conducted through Internet-based survey programs and online respondent panels.
Online surveys offer a number of advantages over telephone interviewing, including the ability to show respondents pictures or video during the interview, as well as the ability to include other types of graphics-based interview methods.
For additional perspective on the Pew study, see Will Oremus’s very good article in Slate.com, entitled “Minority Opinions: Hardly anyone responds to public opinion surveys anymore. Can we still trust them?” Mr. Oremus concludes, and we agree with him, that surveys are still a good way to collect data. If you’d rather hear Mr. Oremus talk about it than read it for yourself, you can listen to his NPR interview.
The bigger question for us is what we can do about response rates. Given that marketers, managers, and pollsters rely upon survey data, how can we increase the response rates for our surveys?
Here are 10 suggestions to make sure the response rate on your next survey is as high as possible.
- Screen well. Almost every survey will have screening questions to qualify a respondent. Screen efficiently, and disqualify those who do not fit your respondent profile in as few questions as possible. This saves you both time and money. So, if you want to talk to the person who does most of the grocery shopping, ask that question first. If it’s a survey about diapers, ask the ages of children first. Or if the topic is auto insurance, ask vehicle ownership first, even before you ask gender and age.
- Keep your survey short. You can gain a lot of information in a 10-minute survey if you stay focused on what you need to know and cut back on the nice to know. Similarly, try to eliminate some of the redundant questions that get at the same information in different ways.
- Be honest about length. Tell the respondent how long the survey is expected to take, and be honest about it. Don’t tell a respondent it’s a 10-minute survey if it’s actually 25 minutes long. If they leave after 10 minutes, you’ve lost time and money to replace that interview. Or they may tune out and provide poorly considered or random responses; we call that “respondent fatigue”. This is worse than someone leaving the interview, because now you may have bad data.
- Use phone surveys to measure awareness. A telephone survey is a good way to ask for unaided awareness of brands and advertising because it’s truly a top-of-mind response. In open-ended questions (such as “Why did you say that?” or “What do you remember about that ad?”), a phone survey can provide rich and complete responses. The presence of an interviewer means that we can clarify confusing answers, and ask follow-up probes, such as “Can you explain what you meant by that?” or “Are there any other reasons you prefer this brand?”
- Split a long survey in two. Consider running two shorter surveys instead of one long survey. You can ask awareness questions on the phone and usage and attitude ratings in an online survey.
- Use the right scales. Rating scales are commonly used in marketing research surveys, and there are two basic ways to present a scale: an anchored scale (only the endpoints are labeled) and a descriptive scale (every point is labeled). We often use anchored scales in telephone surveys because it is easier for a respondent to remember a range from 7 to 1 than the descriptions of seven scale points. Online, the descriptive scale works very well.
- Keep lists short, especially in phone surveys. Long lists of attribute ratings get particularly tedious on the telephone and respondents may not pay very close attention after rating 10 or 15 items. Limit the length of the lists. If your survey must have ratings of multiple brands, importance ratings, and attitude ratings, it is probably better if done in a self-administered survey online.
- Ask questions that are easy to answer. Your data will be better if questions are easy to answer. If a respondent answering your questions has to check what’s in their pantry, or find their latest utility bill, the time that they spend checking will count against the total time that they are willing to spend on your survey before fatigue sets in.
- Include a “Don’t Know” response. Respondents typically want to be helpful. Under some survey conditions, they may provide answers to a yes/no question even if they would prefer to answer “don’t know”. To avoid forcing respondents into a choice they don’t really want to select, be sure to include “Don’t know” and “Prefer not to answer” options in your answer list.
- Order your survey carefully. A sensitive question is any question that people may not want to share with the general public (such as income), or information that could affect how they are perceived by others. Ask easier questions at the start of a survey, and more difficult or more sensitive questions at the end of the survey. A respondent may feel more comfortable sharing personal information later in the survey.
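The first tip, screening efficiently, can be sketched as a simple flow: ask the most selective qualifying question first so that non-qualifiers exit after as few questions as possible. The question names and order below are hypothetical, following the grocery/diaper examples above.

```python
# Illustrative screener flow for a diaper survey (hypothetical questions).
# The most selective criterion comes first, so a disqualified respondent
# is released after a single question instead of sitting through several.

def screen(answers):
    """answers: dict of respondent answers keyed by question name.
    Returns (qualified, questions_asked)."""
    asked = 0

    # Most selective criterion first: primary grocery shopper.
    asked += 1
    if not answers.get("primary_grocery_shopper"):
        return False, asked

    # Next criterion: has a child under 3.
    asked += 1
    if not answers.get("child_under_3"):
        return False, asked

    # Demographics last: only qualified respondents reach this point.
    asked += 1
    return True, asked
```

With this ordering, a respondent who does no grocery shopping is released after one question, which saves interview time and fielding cost on every disqualified contact.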
By following these 10 tips, you can keep the response rate to your next survey as high as possible.