Distrust of experts should not mean abandoning evidence – it’s more important than ever

Dr Max Nathan is Deputy Director of the ESRC-funded What Works Centre for Local Economic Growth. This blog was originally published on the Centre’s blog.

We were in Manchester recently to reflect on the work of the What Works Centre for Local Economic Growth over the last three years, and to think more broadly about the role of evidence in policy in a post-experts world. We were lucky to be joined by Diane Coyle, who spoke alongside Henry Overman and Andrew Carter on the panel.

Evidence-based policy is more important than ever, Diane pointed out. For cash-strapped local government, evidence helps direct resources into the most effective uses. As devolution rolls on, adopting an evidence-based approach also helps local areas build credibility with central government departments, some of whom remain sceptical about handing over power. Greater Manchester’s current devolution deal is, in part, the product of a long-term project to build an evidence base and develop new ways of working around it.

A lack of good local data exacerbates the problem, as highlighted in the Bean Review. The Review has, happily, triggered legislation currently going through the House of Commons to allow the ONS better access to administrative data. Diane is hopeful that this will start to give a clearer picture of what is going on in local economies in a timely fashion, so the feedback can be used to influence the development of programmes in something closer to real time.

Diane also highlighted the potential of new data sources – information from the web and from social media platforms, for example – to inform city management and to help understand local economies and communities better. We think this is important too; I’ve written about this here and here.

Diane also raised some important points about what impacts we can expect certain policies to have, and how we might capture this. Most evaluations can get at the direct effects of a policy, but bigger, system-wide effects are often harder to see. For example, we can track the effects of broadband rollout on firm take-up, but the wider impacts of greater connectivity on, say, e-commerce are both more important and harder to get at.

In practice, researchers have made some progress on this in transport policy, where DfT now routinely calculates the wider economic impacts of new roads and rail, not just the direct effects on travel times and commuting. But Diane’s right that we still have some way to go in other policy domains. For evaluators, I think this highlights the importance of having a good theory of change – even if you can’t track all policy impacts, knowing what you’re missing will help you better understand what you can see.

The Q&A covered both future plans and bigger challenges. In its second phase, the Centre will be producing further policy toolkits (building on the training, business advice and transport kits already published). We’ll also be doing further capacity-building work and – we hope – further pilot projects with local partners.

At the same time, we’ll continue to push for more transparency in evaluation. BEIS is now publishing all its commissioned reports, including comments by reviewers. We’d like to see other departments follow suit.

At the Centre, we’d also like to see wider use of Randomised Controlled Trials in evaluation. Often this will need to involve ‘what works better’ settings where we test variations of a policy against each other – especially when the existing evidence doesn’t give strong priors. For example, Growth Hubs present an excellent opportunity to do this, at scale, across a large part of the country.

That kind of exercise is difficult for LEPs to organise on their own. So central government will still need to be the co-ordinator – despite devolution. Similarly, Whitehall has important brokering, convening and info-sharing roles, alongside the What Works Centres and others.

Incentives also need to change. We think LEPs should be rewarded not just for running successful programmes, but for running successful evaluations – whether or not the programmes themselves turn out to work.

Finally, as Diane emphasised, we and other Centres need to keep pushing the importance of evidence, and to as wide a set of audiences as we can manage. Devolution, especially when new Mayors are involved, should enrich local democracy and the local public conversation. At the same time, the Brexit debate has shown widespread distrust of experts, and the ineffectiveness of much of the language and many of the communication tools experts rely on. The very long-term goal of the Centres – to embed evidence into decision-making – has certainly got harder, as Henry suggested. But the community of potential evidence users is getting bigger all the time.