We thought we had a great idea to spur innovation and growth in Kenya's informal furniture sector: we opened and operated a tool library in Nairobi's informal furniture district. The tool library, aptly named “WorkShop”, offers capital in the form of access to quality, industrial-grade tools, as well as skills in the form of training classes from a five-week curriculum on business practices, technical skills and customer management. We also developed a custom app that gives carpenters digital information on material selection, design, manufacturing techniques and business practices.
Being diligent self-skeptics, we weren’t sure our idea would work, so we decided to test it as rigorously as possible: ~100 randomly chosen woodworkers in Nairobi were offered access to the full suite of WorkShop services, another ~600 were given access to the app alone, and the remaining ~700 served as our control group.
It turns out that the idea did spur innovation and had some positive impacts, but that doesn’t mean it was the best time for a study. We were happy to see that our WorkShop programme was a success, but we realise that with a few adjustments it could have been even more successful. Along the way, we learned a few practical lessons about launching and evaluating such projects:
- Don’t put too much faith in market research: the biggest challenge with our study was that only ~20 of the ~100 woodworkers offered a chance to participate in WorkShop’s programme actually did so. This was despite intensive market research: for almost six months we had conducted exploratory focus groups, interviews and small surveys with woodworkers to learn about their businesses, assess their needs and gauge interest in the WorkShop. Based on that research, we expected overwhelming participation; the actual response was far weaker. The few who did attend were passionate and diligent about the time they spent working with us – this is likely why we were able to find results despite our low take-up. We learned that what people say and what people do can be polar opposites.
- Any hassle is a big hassle: as we explored the very low take-up, we were reminded of the power of small frictions: some woodworkers in the nearby zone were dissuaded from using the tools by the need to transport materials to and from the shop. Note that many of these woodworkers were using hand tools, and that access to industrial-grade machinery would both save them time and teach them a new skill. Yet these small upfront frictions were enough to keep people away.
- Credit constraints exist: even though many woodworkers saw the offer to attend the training and use the tools as valuable, some were simply unable to take time away from immediate work to attend a few hours of classes a day because of liquidity constraints. We learned that the opportunity cost of people’s time is high, and that time is not always divisible: losing a few hours of work might mean losing a whole day’s work.
- An RCT is a snapshot – time your exposure: though testing impact is critical, it’s wise to measure at the right time. We had high expectations for take-up, but reality proved otherwise. Though we were still able to observe some results, we might have learned more with a larger treated sample, and, in retrospect, we would have run a few pre-pilot sessions first.