Lessons on designing tech RCTs with SMEs

By Capucine Riom, Anna Valero and Juliana Oliveira-Cunha on Wednesday, 12 January 2022.

How can we help small and medium-sized businesses improve their productivity? New technologies based on artificial intelligence, such as chatbots and marketing automation, seem to offer clear benefits but are not yet widely adopted by SMEs. Alongside the Greater London Authority, Capital Enterprise and CognitionX, we designed a randomised trial to investigate how to encourage SMEs in London's retail and hospitality sectors to adopt these technologies.

We set out to recruit 500 businesses over two years. Recruitment of SMEs proved difficult from the outset in 2019, and the uncertainty and disruption caused by Brexit and then Covid-19 created additional challenges, affecting our ability to deliver support and collect data from firms. We eventually recruited 229 firms into the trial, but only 43 answered the endline survey.

What we found

Our low sample size limits our ability to present robust estimates of treatment effects. Yet we found that adoption-related activity, such as allocating resources towards exploring the adoption of AI technologies, was higher in the treatment groups than in the control group. 

We also found that firms that received one-to-one expert advice as well as financial support considered that the programme addressed barriers to a greater extent than firms that attended lighter-touch events with tech providers.

The qualitative evidence collected during our interviews also suggests that SMEs valued technological upgrading for their business performance and seemed willing to adopt these tools so long as they proved to be cost-effective. 

What went wrong

Business support programmes, especially those targeting SMEs, are notorious for recruitment and attrition difficulties. Our experience was no different: 

  • We quickly observed that fewer firms applied than expected. We had to revamp our recruitment strategy and employ a direct phone marketing agency to help with outreach.

  • We experienced challenges in terms of programme take-up, even for firms that signed up to attend our events.

  • The response rate in our follow-up data collection was low. The team had to send out personal emails and make phone calls to every participant to increase the number of survey responses. These efforts had limited success. 

  • Some of the mechanisms we set out to test through our trial, such as peer effects, relied on good attendance, so the small numbers attending undermined the quality of the intervention itself.

What can we learn about designing RCTs with SMEs? 

  1. Invest in recruitment and build flexibility into the RCT design to increase sample size

The most time-consuming and expensive recruitment method – direct phone marketing – was by far the most fruitful. Social media campaigns and newsletters circulated through local government or business network channels had limited success. So what can be done to address this?

  • Plan for a large marketing budget to conduct direct phone recruitment.

  • Ensure that direct phone marketing providers make clear to prospective participants what they are enrolling in. Some of the SMEs recruited had a poor understanding of the business support on offer in the trial. 

  • Be prepared to make some changes in the RCT design to increase sample size. We moved from one large event with many tech distributors and participants to a cohort design with multiple smaller events. Careful planning is then required to ensure consistency in data collection and in the delivery of business support between cohorts. 

  2. Make events easily accessible to SMEs 

Managers of SMEs, especially in retail and hospitality, are likely to struggle to attend in-person events even when these are free of charge and potentially beneficial to their business. Some managers who were unable to attend asked us whether the support was also delivered online. Be flexible in how the support is provided, for example by allowing online delivery or on-demand access to programme content.

  3. Plan for outcome data independent of questionnaires 

An endline questionnaire will likely suffer from attrition and a small final sample. Consider whether it is possible to use sources of outcome data that do not rely on questionnaires. These may be intermediate outcomes measurable online: for example, we combined endline data on AI adoption with visual inspection of SME websites to identify the presence of chatbots and marketing automation tools.
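
As a rough illustration of how such online checks might be automated, the sketch below (in Python, using the requests library) fetches a firm's homepage and scans the HTML for script snippets associated with common chatbot and marketing-automation providers. The provider markers and the example domain are illustrative assumptions rather than the list used in our trial; in our case the inspection was done visually, but an automated pass like this could pre-screen a longer list of firm websites before manual review.

    # Sketch: flag likely chatbot / marketing-automation tools on a firm's homepage.
    # The provider markers below are illustrative assumptions, not the trial's actual list.
    import requests

    MARKERS = {
        "chatbot": ["intercom.io", "drift.com", "tawk.to", "livechatinc.com"],
        "marketing_automation": ["hubspot.com", "mailchimp.com", "klaviyo.com"],
    }

    def detect_tools(url, timeout=10):
        """Return which tool categories appear to be present on a firm's homepage."""
        html = requests.get(url, timeout=timeout).text.lower()
        return {
            category: any(marker in html for marker in markers)
            for category, markers in MARKERS.items()
        }

    if __name__ == "__main__":
        # Hypothetical example domain; in practice this would loop over the trial's firm list.
        print(detect_tools("https://example.com"))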

  4. Test the demand for the specific intervention on offer 

The qualitative and quantitative evidence collected during our trial revealed that SMEs in the retail and hospitality sectors were not prioritising investments in the types of technologies we provided advice on. This issue might have been heightened by the need for businesses to respond to the pandemic, perhaps prioritising other technologies. Future trials focused on technology adoption should seek out evidence of demand for the specific technologies at the outset. They should focus on the practical, day-to-day challenges of the adoption journey. Technologies on offer should not only be ‘tried and tested’, but also ‘shovel-ready’ and customisable for businesses.

  5. Unify communications

Our delivery team involved several organisations, each communicating directly with participants: the Greater London Authority ran the programme, CognitionX and Capital Enterprise organised the events, and the research team contacted firms about the surveys. Our qualitative interviews suggested that this was confusing for participants. It is key to ensure that all communication and scheduling of events with SMEs is harmonised, for example through unified email addresses within the delivery team. It is also worth thinking about ways to automate recruitment and communication processes to save time.

Silver linings

Despite the limitations of the trial, the lessons learned helped progress evidence-based policymaking in the UK. Our experience on this programme fed into the design of the new Technology Adoption Service on the London Business Hub, launched in March 2021. This free service was built as a searchable online marketplace of technology providers to signpost small businesses to the best-fit technology solutions for their needs. As a result of the lessons we learned, the platform is more focused on shovel-ready and tried-and-tested technologies – such as web-based accounting, cloud-based computing, e-commerce and customer relationship management systems – rather than the more cutting-edge solutions that were the subject of this trial.


Since this blog was first written, the outcomes from this experiment have been covered in more detail in IGL Working Paper No. 23/01.

This project was funded by the Department for Business, Energy and Industrial Strategy and implemented with the support of Innovate UK and IGL under the Business Basics Programme.

IGL has collated findings from across all of the funded experiments. Emerging findings are discussed in this IGL blog, with IGL's full evaluation report now expected to be published in Autumn 2023.