
How to boost survey response rates from businesses

24 April 2024

Rob Fuller

Have you ever sent out a survey to businesses and heard… nothing? You’re not alone. Low response rates plague researchers, particularly those targeting busy executives. A review carried out in 2010 found that response rates are lower for managers than for non-managers, and particularly low among top management/executives. In the Business Basics Programme, the response rate to surveys of SMEs averaged 43%, and was as low as 19% in one project. On the other hand, some projects have been able to achieve good response rates – e.g. 76% in one of the Business Basics projects and 89% in the EU-funded DepoSIt project.

Low response rates pose multiple threats to research. They shrink your sample size, weakening the statistical power of your study. Worse, they can introduce bias into your results if non-response isn’t random – e.g. if your control group is less likely to respond than the treatment group. They also limit the generalisability of your findings, making it hard to know whether they apply to the wider population you studied.
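To see how quickly power erodes, here is a minimal sketch (not from the original post) using the statsmodels power calculator, assuming a two-sample t-test, a hypothetical standardised effect size of 0.3, and 500 firms invited per arm:

```python
# Illustration: how non-response erodes statistical power in a
# two-arm trial. The effect size and sample sizes are assumptions
# chosen for the example, not figures from the post.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
invited_per_arm = 500
effect_size = 0.3  # hypothetical standardised effect (Cohen's d)

for response_rate in (0.9, 0.6, 0.4, 0.2):
    respondents = invited_per_arm * response_rate  # firms who reply
    power = analysis.power(effect_size=effect_size,
                           nobs1=respondents,  # respondents in arm 1
                           ratio=1.0,          # equal-sized arms
                           alpha=0.05)
    print(f"{response_rate:.0%} response: "
          f"{respondents:.0f} firms per arm, power = {power:.2f}")
```

Running this shows power falling steadily as the response rate drops, even though the number of firms invited never changes – and that is before any non-random attrition biases the estimate itself.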

IGL’s experience of supporting dozens of RCTs with SMEs over the years has given us some indications of what does and doesn’t work to improve response rates. One option that often comes up is offering incentives.

Only a couple of IGL-supported projects have tried providing direct incentives in return for responding to surveys (e.g. a gift voucher or entry into a prize draw), and it’s not clear what impact these have had. But there is some evidence that financial incentives can increase response rates from businesses. A review in 2009 found several studies in which financial incentives produced marked increases in response rates over a control group. Most of the cases it examined involved prepaid incentives rather than payments promised on completion. Only one study compared prepaid with promised incentives, and it did not find clear evidence of a difference in response rates between the two.

We certainly don’t have all the answers, and there’s lots of potential for experimenting to find the best approaches. We’d particularly like to see rigorous evidence on which tactics work best for business surveys.

Let us know (at [email protected]) if you’re interested in experimenting with these approaches – or if you’ve already done so and have results to share.

In any case, it seems clear that survey response rates will always be a challenge, unless the project is very small and you can dedicate a lot of resources to following up. Even a relatively good response rate (say 60 to 70%) leaves room for doubt about the findings of a study. For that reason, we encourage those designing experiments to consider alternative sources of outcome data that do not rely on surveys. This might involve using administrative data (data collected in the course of delivering your services, such as the number of requests for additional support), existing datasets (such as tax data that is routinely collected from businesses), or generating data from publicly available sources (such as businesses’ websites – see the sketch at the end of this post).

In the near future IGL is planning to launch a project that returns to some older experiments and uses tax data to examine their longer-term outcomes. We’re also working to grow IGL’s in-house data science capability, and will have more to share soon on how that will complement our ongoing work on experimental research – including the scope to find more timely outcome measures with lower attrition. Watch this space to learn more!
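As a concrete illustration of that last kind of approach, here is a minimal, entirely hypothetical Python sketch that checks whether a firm’s website is still reachable – a crude proxy for business survival derived from public sources. The firm names and URLs are placeholders, not from the original post, and this is not a description of IGL’s planned pipeline:

```python
# Hypothetical sketch: use website reachability as a crude public-data
# proxy for business survival. All names and URLs are placeholders.
import requests

def website_alive(url: str, timeout: float = 10.0) -> bool:
    """Return True if the site responds with a non-error status code."""
    try:
        response = requests.get(url, timeout=timeout, allow_redirects=True)
        return response.status_code < 400
    except requests.RequestException:
        # DNS failure, timeout, connection refused, etc.
        return False

# In practice these would come from programme records.
firms = {
    "Acme Widgets Ltd": "https://example.com",
    "Beta Traders Ltd": "https://example.org",
}

for name, url in firms.items():
    status = "reachable" if website_alive(url) else "unreachable"
    print(f"{name}: {status}")
```

A proxy like this is noisy (sites move, lapse, or sit behind bot protection), so it would complement rather than replace administrative or tax data.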