Believing in evidence means being surprised

Published: 25/05/2023

Author: Professor Michael Sanders

Believing in evidence-based policy and practice should be fairly uncontroversial. The idea that what we do, and what government does, should have some backing from evidence feels kind of straightforward - after all, who wants to be the one to say that our practice should be completely evidence-free?

Talking about evidence-based practice is harder, because you will fairly quickly come into conflict with someone who disagrees with you about what constitutes evidence. If you mean that evidence-based practice needs to be supported by a randomised controlled trial (RCT) or a similar method for statistically identifying the impact of an intervention, the words will barely be out of your mouth before someone tells you that RCTs aren't suitable outside of medicine; that they're unethical; that the world is too complex for the simple assumptions of an RCT to be realistic.

There is also a tension at the heart of RCTs of complex interventions: RCTs demand a single primary outcome measure, or a small number of them, against which success must be judged. This is necessary if we want studies to be robust, and to avoid 'questionable research practices' that let researchers go on fishing expeditions, testing many outcomes in the hope of finding one for which the results are positive and which 'proves' that an intervention works. Reasonable people might disagree, for any given intervention, about what the primary outcome should be, or might reject the idea of reducing a complex practice to a single pass/fail metric. Even among your would-be allies, the highly statistically minded economists, you can find a Nobel prize winner to tell you why RCTs suffer from not being statistical enough.
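To make the fishing-expedition worry concrete, here is a minimal simulation sketch (mine, not the author's; the function name and parameters are illustrative). If an intervention truly does nothing, each outcome tested still has a 5% chance of a 'significant' result, so with k independent outcomes the chance of at least one false positive is roughly 1 - 0.95^k: about 23% at five outcomes and 64% at twenty.

```python
# A minimal, illustrative simulation (not from the article): under a truly
# ineffective intervention, testing many outcomes makes a spuriously
# "significant" result increasingly likely.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def chance_of_false_positive(n_outcomes, n_sims=5_000, n_per_arm=100):
    """Probability that at least one of n_outcomes shows p < 0.05
    when the intervention has no true effect on any of them."""
    hits = 0
    for _ in range(n_sims):
        for _ in range(n_outcomes):
            treated = rng.normal(size=n_per_arm)   # treated arm, no real effect
            control = rng.normal(size=n_per_arm)   # control arm
            _, p = stats.ttest_ind(treated, control)
            if p < 0.05:
                hits += 1
                break  # one 'significant' outcome is enough to claim success
    return hits / n_sims

for k in (1, 5, 10, 20):
    print(f"{k:>2} outcomes -> P(at least one p < 0.05) ~ {chance_of_false_positive(k):.2f}")
```

Pre-registering a single primary outcome caps that error rate at the nominal 5%, which is exactly the discipline the paragraph above describes.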

Nonetheless, these debates can mostly be kept civil: few people truly hold extreme views, and you can find your tribe of people who share your methodological outlook.

Harder still than believing in evidence-based policy, or talking about it, is living it. The most important attribute in someone who truly believes in evidence is not advanced degrees or any particular methodological training; instead, it is a willingness to be surprised. Whatever intellectual framework you strap onto your research question, it must be possible – even likely – that you will discover something that challenges or contradicts your preconceptions.

If all of your research findings align with your own prior beliefs, your own prejudices, your own politics, then either you have a godlike knowledge of the world, or you’re doing something wrong.

In my time I’ve been lucky to be involved in several projects where I really, truly believed that we were making a difference. In all of these cases, we had a good theory, and foundational qualitative and quantitative research showing all the signs we were looking for. Take, for example, the Social Workers in Schools programme, which I commissioned when working at What Works for Children’s Social Care (now What Works for Early Intervention and Children’s Social Care). Initial pilots in Stockport, Southampton and Lambeth showed promising results, and evaluators at Cardiff worked with schools and social workers to develop a robust theory of change. Another example is the Study Supporter programme, which I worked on with my former student Bibi Groot. Bibi spent years - her entire PhD - developing the idea of leveraging social support networks to improve GCSE pass rates in Further Education Colleges, working closely with students, parents, and teachers to build her intervention.

In both of these cases, however, rigorous research found that we weren’t making the difference we’d hoped for. Social Workers in Schools was recently found by Dave Westlake and his colleagues at Cardiff to have no effect, across a range of outcomes, on the social care measures it was developed to influence, while the Study Supporter programme, despite warm feelings from participants, did not change students’ likelihood of passing their Maths or English GCSEs. Both findings were a surprise, and a disappointment, because I believed in the interventions we were testing, and because I was, and am, committed to trying to improve things for the young people I work with. On other occasions, interventions that I was convinced would not work ended up having a bigger impact than I expected.

These surprise findings can feel like a kick in the teeth: you’ve spent years, and large sums of money, developing and delivering an intervention, only to find out that it doesn’t work (or worse, that it actively makes things worse). This is partly why living evidence-based policy is hardest. Not only do you have to contend with other people’s views; you have to fight a battle in your own head. You need to accept someone else’s analysis. You need to resist taking too much comfort in your secondary or exploratory outcomes, which are essential for providing colour and context to the evaluation but which, ultimately, were decided in advance to be less important than the primary outcome.

Just winning that battle isn’t enough. If we’re serious about evidence, we need to celebrate the creation of new, high-quality evidence, not just new, exciting results. We must meet those two imposters – victory and defeat – just the same, and celebrate what we have learned.

This means singing from the rooftops about null results from well-conducted studies, just as we would about positive results from the same. It means not viewing money as wasted if the study teaches us something about the world. Believe me, I know this is unbelievably difficult, and it can feel disingenuous to praise studies that have disappointed us - but ultimately, when it comes to evidence, this is the only way we will make progress.


Professor Michael Sanders is Professor of Public Policy at the Policy Institute, King’s College London.