In our previous blog post, we spoke to Doug Scott, Chair of Cavendish Enterprise, about what he learned from leading a randomised trial of the Business Boost scheme, carried out under the UK Government’s Business Basics Programme. Today we continue our conversation with Doug, this time focusing on how funders can best manage experimentation funds and ensure that they produce learning that leads to better policy decisions.
IGL: What should funders be looking for when reviewing proposals to run experiments on business support interventions?
Doug Scott: Of course it’s essential that the team have identified an intervention that is promising and that will be of interest to policymakers if it proves successful. But another crucial point is that bidders should be genuinely interested in learning whether (and how) their intervention works. Running a randomised trial means engaging with a lot of technical detail and finding ways to deal with difficulties in implementation, so it’s very important that the team are committed enough to engage fully with this. On the other hand, it’s difficult to assess a team’s commitment in a conventional application process. Perhaps the best approach is to ask teams open questions about their reasons for putting forward the proposal, and listen out for indications of their motivation in the response.
Is there anything else that could be improved in the selection process to ensure that the team will be equipped to carry out a trial?
Traditional bidding processes tend to be very structured and regimented, but in the end you will be relying to a great extent on applicants’ own motivation and their goodwill to implement what they have promised, to the best of their ability. It may be worth exploring ways to make the application process more collaborative. For example, for the third round of the Business Basics Fund, an expression of interest stage was introduced, which allowed more room for discussion and negotiation over what the potential applicants had to offer.
What about during the delivery phase itself? What do funders need to be aware of to enable a trial to succeed?
In the Business Boost trial, we found that we needed to make various small adjustments as our plans developed, either to resolve practical issues of implementation or to find alternative strategies when recruitment or compliance were not as high as hoped. It’s a common theme in trials of business support programmes that some testing and adjusting along the way will be required, which means that the contracting and project management processes should be flexible enough to accommodate these changes. That requires a shift away from the traditional view that contracts should specify deliverables and milestones to a high level of detail.
Given your experience with the Business Boost trial, how do you feel now about experimentation and RCTs as a tool for impact evaluation?
I’d say that you should take the results of a randomised trial seriously but not literally. A trial can produce very interesting insights, but you shouldn’t blindly expect the findings to generalise to other contexts. To understand whether and how to scale up interventions that have been successful in a trial, we need to scrutinise which of the moving parts were critical, which contextual factors were important, and how exactly the intervention was carried out in practice.
We’re also much more likely to be able to put the findings of a trial to use if there’s a route to speedy scale-up. But this points to an important challenge for using RCTs in policy design: there’s a trade-off between having results available as quickly as possible and letting the trial run for longer so as to capture longer-term impacts. In the past, some trials haven’t got this balance right, producing very informative and robust results, but only after priorities had moved on and the window for influencing policy had closed. Striking the balance between producing valuable insights and making sure those insights are available quickly will always be key to maximising the value of experiments like this.
Read our previous blog post to find out what Doug learned from leading a randomised trial of the Business Boost scheme, carried out under the UK Government’s Business Basics Programme.
You can also read the Working Paper here.