Guide to Improving Response Rates

Response rates are a popular topic among Customer Experience (CX) and Market Research professionals. Most businesses continually strive to increase survey response rates, but is the response rate a reliable metric on its own? Not only is it driven by an array of factors, but forecasting it is complex and the range varies dramatically across programmes.

Fortunately, there are several metrics we can use to determine the validity, precision, and usefulness of a sample. There are also ways to identify the source of low response rates – meaning we can improve them, increase the quality of the sample and – most importantly – maximise the value of the information the programme yields.

Reasons for Low Response Rates

The recent decline in response rates is a problem in terms of both survey cost and validity, which makes it all the more important to analyse the contributing factors. In B2B customer research, response rates tend to be higher because suppliers usually have strong relationships with their customers, yet even there the feedback process presents challenges.

According to the Pew Research Center, response rates on telephone surveys have been in decline since 1997. Whether the cause lies in internal business processes, the respondents or the survey itself, a whole host of factors could be affecting your response rate.

  1. Customers are inundated with feedback requests. Surveys are pervasive, and with the relative ease of launching them there is a glut of requests for feedback. Many surveys are poorly written and do not resonate with the customer, so over time it becomes difficult to rise above the noise.
  2. Time is at a premium. Customers are busy. We are all asked to do more with less, and respondents have less time than ever to provide feedback.
  3. Little evidence of ROI. When a customer provides feedback yet sees no changes, the time they invested in answering the survey feels wasted. In the long run, feedback requests fall to the bottom of the priority list.
  4. Poor respondent engagement. The customer is simply not engaged in the relationship overall, suggesting a revenue flight risk. Alternatively, the survey content is not relevant to the customer’s role, leading to early termination of the survey.
  5. Surveys are too long. The days of excruciatingly lengthy, involved, complex surveys are long gone. Fortunately, with the right planning and platform, you can still gain considerable insight from shorter surveys.
  6. Your company may not be engaged. Consider how you engage with customers: do you adopt a high-touch model, or is yours more passive? High-touch models tend to yield higher response rates.
  7. Quality of the customer database. Look at your customer data infrastructure. When the sample file contains a considerable amount of dated, unusable sample, the overall response rate suffers.

The good news is that we can address all of these concerns. With a little time and effort, you will be rewarded with stronger customer relationships that will separate you from your competition.

Validity, Reliability and Usefulness

We often ask “Are my results valid?” and “Are my results reliable?” Statistically speaking, these are two different questions. Validity refers to whether a metric measures what it should measure. Reliability refers to consistency – is the metric stable?

Perhaps we should ask “Are my results useful and/or usable?” To determine the usability of your results, consider the following:

  • Representative sample: Is the sample representative of the entire population being studied?
  • Degree of response variance: Are the responses to our survey questions tightly clustered or widely scattered?
  • Desired confidence level: Is 90% good enough? Or are we striving for 95% confidence?
  • Business validity/utility: Business intelligence insights can still be meaningful and impactful with small sample sizes. Our job is to guide business decision making.
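The confidence-level question above can be made concrete with a margin-of-error calculation for a proportion. A minimal sketch in Python, assuming standard normal z-scores for each confidence level; the sample size of 250 is purely illustrative:

```python
import math

# z-scores for common confidence levels (standard normal distribution)
Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def margin_of_error(n, confidence=0.95, p=0.5):
    """Margin of error for a proportion estimated from n responses.

    p=0.5 is the most conservative assumption (widest interval).
    """
    z = Z[confidence]
    return z * math.sqrt(p * (1 - p) / n)

# e.g. 250 completed surveys at 95% confidence
print(round(margin_of_error(250) * 100, 1))  # → 6.2 percentage points
```

Note how the margin shrinks with more responses and grows with the desired confidence level: dropping from 95% to 90% confidence narrows the interval, which is exactly the trade-off raised in the bullet above.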

What is a Good Customer Survey Response Rate?

While there is no such thing as a perfect sample or a perfect response rate, we know the question will be raised: what is a good response rate, and how does our programme compare? The reality is that there is wide variance across programmes. Don’t let promises of response rates of 48%, 65% or 82% convince you that anything is “wrong” with your programme.

Achieving a statistically valid response volume and using customer feedback to drive lasting, positive business change is more important than driving a sky-high response rate. A response rate is a means to an end, not the end itself!

Understanding the response rate drop off helps us focus our improvement efforts. Simply put, the response rate is the ratio of completed surveys to the number of sent surveys:

Response rate = No. of Surveys Completed / No. of Invitations Sent

However, taking a look under the hood reveals more to the equation. Four gates sit between the invitations sent and the surveys completed:

  • Email Delivery Rate: an indicator of sample quality.
  • Email Open Rate: how many of the invitations sent are opened by the recipient - generally considered a measure of how compelling the email subject line is.
  • Email Click-Through Rate: how many people clicked the link in the email.
  • Survey Completion Rate: how many that opened the survey actually completed it. It is used to assess survey length and/or relevance of survey content.

Think of this as a response rate funnel, with each stage providing insight on where to focus improvement efforts.
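The funnel arithmetic can be sketched as follows. This is a minimal illustration, assuming each stage rate is measured relative to the previous stage (open rate relative to delivered, and so on); all the counts are made up:

```python
# A minimal sketch of the response rate funnel; the counts are illustrative.
sent      = 10_000
delivered = 9_400   # survives the email delivery gate
opened    = 2_350   # survives the open gate
clicked   = 705     # survives the click-through gate
completed = 480     # survives the completion gate

delivery_rate   = delivered / sent
open_rate       = opened / delivered
click_rate      = clicked / opened
completion_rate = completed / clicked

# The overall response rate is the product of the stage rates,
# which equals completed / sent:
response_rate = delivery_rate * open_rate * click_rate * completion_rate
assert abs(response_rate - completed / sent) < 1e-12

print(f"Response rate: {response_rate:.1%}")  # → Response rate: 4.8%
```

Because the overall rate is a product, the weakest stage dominates: doubling a 25% open rate would double the response rate even if every other stage stayed the same, which is why diagnosing the funnel beats chasing the headline number.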

What can be done to improve the response rate?

The obvious question remains: how can we achieve, or maintain, a high rate of response? Here’s how you can maximise your response rates:

Clean your database first. The old adage is true – garbage in, garbage out. Data hygiene should not only be considered before launching a survey, but should be ongoing. Segment your contacts by role so you can align the right questions with their area of responsibility. The customer database is a key strategic business asset that, if cared for, can provide a greater return.

Improve pre-survey communication. Whether by email, website banners or sales team communication, pre-survey outreach to customers can yield better survey response rates. The outreach should explain why you are doing the survey, what you will do with the data and, if you have run prior waves, what you have changed as a result of the feedback. Follow up with non-responders to personally ask for participation. Top tip: in all cases, avoid the “please respond and give me all 10s” request at all costs!

Don’t be afraid to test approaches. A/B testing can lead to better response rates. This is particularly important if your analysis suggests that open rates on email invitations are an issue.
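When comparing two invitation variants, a quick significance check helps confirm the winner isn’t noise. A hedged sketch using a two-proportion z-test; the counts for subject lines A and B are hypothetical:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing two rates, e.g. open rates of two subject lines."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g. subject line A opened 220/1000 times, subject line B 180/1000
z, p = two_proportion_z(220, 1000, 180, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these illustrative counts the difference is significant at the 95% level, so it would be reasonable to adopt subject line A; with smaller samples the same gap could easily be chance.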

Don’t just “spray and pray”. Launching invitations and waiting for the surveys to roll can be a recipe for disappointment. Spend time monitoring the delivery, open, click-through and completion rates. Getting an early warning of potential issues (and the corrective actions needed) can prevent problems later in the process.

Watch your language. Avoid research-oriented words such as “survey” or “questionnaire”, which can trigger spam filters. Such language is also impersonal and does little to encourage customer participation.

Start the survey in the invitation. Nesting the first survey question in the body of the invitation encourages the recipient to start the survey straight away, and ensures the most important question is answered first.

Keep it short and targeted. For each question in a survey, ask yourself “What do we want to learn, what will we do once we learn it, and who will own it?” If you cannot answer all three, reconsider including the question.

Don’t ask what you already know. Doing so sends a message that you don’t know the customer or value their time. Data points you already hold should be appended to the sample data and used as routing variables in the survey.

Manage survey frequency. Sophisticated Voice of the Customer solutions can actively manage the cadence of surveys sent to individuals, minimising the over-surveying burden and the survey fatigue that is increasingly common.

Thank the customer. After the survey, thank the customer for their time and feedback. Include a recap of what you plan to do with the information and avoid an impersonal approach that looks like a canned auto-response.

Close the loop. Encouraging company reps to close the loop with the customer will promote accountability. Teams should be trained on the best, most appropriate way to use this feedback - particularly when it is negative.

Act on the insight. We cannot stress this enough – making changes that improve the customer experience will not only be good for business but will also help increase response rates over the long run.

Engage in better post-survey communication. A best practice is to communicate what was learned in the programme and what initiatives you have underway. Create a website where customers can see the status of various initiatives. This offers a level of transparency that encourages participation.