In a recent post, we highlighted the findings from the randomised trial of the Business Boost project, carried out by Cavendish Enterprise in collaboration with the Enterprise Research Centre. Since this was the first randomised trial to be completed under the UK Government’s Business Basics Programme, it was a learning process for the implementers, the evaluators, and for us at the Innovation Growth Lab (IGL). Together with a colleague from the UK Department for Business, Energy and Industrial Strategy, we recently chatted with Doug Scott, Chair of Cavendish Enterprise and the instigator of the Business Boost trial, to find out what he had learned during the process.
IGL: What first motivated you to take part in the Business Basics Programme and carry out a randomised trial?
Doug: We were very curious about the trial methodology - we’d seen interesting results come out of the RCT of the Growth Vouchers programme, and we wanted to know more. Although we were confident that our programmes were having a positive impact, our budget limitations have meant that the evaluations we’ve commissioned in the past have not been as rigorous or as convincing as we’d have hoped. The Business Basics Fund gave us an opportunity to make a robust assessment of the impacts of this programme.
And were the results in line with your expectations?
Yes, more or less. The RCT found good evidence that the programme results in changes – increases in awareness and adoption of some management practices and tools – that we believe are important first steps towards SMEs increasing their productivity. The limited time frame of the RCT meant that we weren’t able to assess the extent to which those changes have translated into productivity gains. But there’s always a trade-off between wanting to produce evidence that we can put to use in the short term, versus wanting to demonstrate longer-term impacts.
Was there anything that surprised you about the process of implementing a randomised trial?
We realised early on that we had underestimated the level of rigour and the complexity involved in carrying out an RCT. At the planning stage we had to spend more time than expected going through the details: how exactly we would implement the programme and randomise which businesses could participate, what the expected impacts were and how to measure them, and much else. And we also found that the workload wasn’t constant – there were short bursts of activity at particular times, which could be challenging to manage alongside our existing workload.
At the start of the process we were concerned that business owners would object to being randomly selected to participate in the Business Boost programme, and that this would negatively affect our good relationships with them. But in this respect we were pleasantly surprised. We were able to be transparent with the participants that this was a research project, that not all businesses would be able to participate during the testing phase, and that whether the intervention would be rolled out further depended on the results.
That’s interesting, as a lot of trials that IGL has supported have had difficulty in encouraging sufficient numbers of SMEs to sign up. Do you have any tips for successfully reaching and recruiting businesses?
It’s true: recruitment was not easy. We started by leveraging our existing networks, offering the opportunity to businesses that had already worked with Cavendish Enterprise or with our partners. But we also found that we needed to be flexible (for example, in the profile of businesses we were targeting) and had to iterate to find an approach that would work. It took some time to get all of this right.
One thing that’s often missed in the rush to recruit participants for a trial is that quality matters as much as quantity. It’s very important that those who sign up are genuinely enthusiastic about the programme, so that they participate fully in the activities and have the potential to benefit.
What advice do you have for others for putting together a team to run an RCT?
It’s crucial that there is a good relationship between the programme implementation staff and the evaluators. You’ll need to be able to have frank conversations and come to an agreement quickly if there are hold-ups or if you encounter unexpected problems. The delivery team needs to have a good level of curiosity about RCTs, and the evaluation team needs to be aware of and sensitive to the practicalities of delivering the intervention.
What other advice do you have for business-support organisations who may be considering getting involved in an RCT?
Understand that the funder’s interest is primarily in the research and what they can learn from it, rather than in the implementation of the programme itself. This requires quite a shift in mindset. As an implementer you may feel that the worst-case scenario is that the programme is found not to have an effect. But in a programme like Business Basics, the worst case is that the results are not robust enough to tell us whether it worked or not.
Read our next blog post to find out more about Doug’s suggestions on what funding agencies can learn from this trial about the design of experimentation funds.
You can also read the Working Paper here.