What We Have Learned From Our Evidence Reviews
Henry Overman is Principal Investigator of the ESRC-funded What Works Centre for Local Economic Growth. Set up in September 2013, the programme has focused on developing a comprehensive evidence base on the effectiveness of different interventions and policies on local economic growth. In this blog, originally published on the What Works Centre’s webpage, he gives an overview of their findings.
With the last of our full evidence reviews now published, it seems like a good moment to reflect on what we can learn from the set as a whole.
In terms of overall policy effectiveness, a few things stand out. First, there are two particular policy areas – estate renewal and sports and culture – where our evidence reviews suggest that spending tends to have a limited impact on the local economy in terms of improving income or employment. As we stressed in both reports, there are plenty of other reasons for undertaking these policies, and in both areas it is plausible that there are more indirect economic links (e.g. the economic benefits of reducing obesity). But our reviews suggest these policies are harder to justify from a local economic development viewpoint.
Second, and more positively, our remaining reviews suggest that deep scepticism about policy effectiveness across the board is not justified. In just about every evidence review that we have done, around half of the high quality evaluations that we reviewed find positive effects on local economic outcomes.
Of course, this overall finding hides some interesting variation across both policy areas and outcomes. For example, from the limited evidence available it appears that business advice is more likely to improve firm-level productivity than it is to increase employment, while (perhaps surprisingly) the opposite appears to be the case for firms that receive support for R&D in the form of grants or loans. The impact of policies can also be heterogeneous – for example, our review of broadband suggested that high skilled workers (and firms employing many of these workers) appeared more likely to benefit than low skilled workers and less skill-intensive businesses.
Unfortunately, this overall finding isn’t all good news, because it means that around 50% of the evidence we’ve reviewed suggests no positive effects on the outcomes we care about. And even when effects are positive, they may not be large enough to justify the cost of the intervention. This raises the natural question of whether we can say anything about what works better. That is, are there aspects of policy design that improve policy effectiveness?
Most of our reviews are able to say something in this regard. For example, our access to finance review found some evidence that loan finance programmes were more likely to be successful than equity finance programmes; our review of Enterprise Zones suggested that local employment conditions could influence the extent to which employment effects were felt locally; and both our apprenticeships and employment training reviews pointed to the importance of employer involvement in improving policy success rates.
But these findings only scratch the surface in terms of the questions that practitioners ask themselves when designing specific policy interventions. Our next strand of ‘toolkit’ work is looking at a somewhat broader evidence base to see what more we can say about these questions. But it’s already clear from that work that the existing evidence can only take us so far.
As we point out time and again in our reviews, improving our understanding of what works for local economic growth requires a wholesale change in the way in which these policies are implemented. We need to undertake far more high quality evaluations of the large amounts that we spend in each of these areas. We also need to place far more emphasis on experimentation, piloting and evaluation at the early stages of policy development, and on ensuring that lessons learned feed back into decisions about scaling up (or down) and into improving policy design. We are building a series of demonstrator projects, with local partners, which will showcase these approaches.
In a relatively short period of time, we have moved some way forward in our understanding of the likely impact of a variety of local economic policies. We also have a better understanding of how some changes to policy design might increase effectiveness.
But the hard work starts here. We need to continue to develop the evidence base, make sure that the lessons from our reviews feed into policy development, and expand that evidence base further by supporting and encouraging effective evaluation of current and future interventions.