
‘One Hundred Days for Early Action’. Why evidence-based policy is a ‘Yes, but…’ way forward


Alongside folk like Polly Toynbee, Gus O’Donnell and Richard Layard, I have a chapter in One Hundred Days for Early Action: Time for Government to put prevention first, published this weekend by the Early Action Task Force.

Here’s how the blurb trails it:

As the UK faces a situation of escalating need and diminishing resource, with public expenditure cuts forecast into at least the next four years, this collection of essays by expert commentators develops practical recommendations which a newly-elected Government could adopt in their first 100 days in office to put an early action approach at the centre of their work, or risk the future of important public services.

My offering’s called ‘Early Action Evidence: its Limitations and Opportunities’. I’ve written it in a personal capacity, but obviously it’s informed by my day-job at the Education Endowment Foundation, where we fund trials in schools of evidence-based ideas designed to increase the attainment of disadvantaged pupils.

The skinny of my argument is this…

Evidence is really useful, but to use it well we have to recognise its limitations. Evidence is not a magic wand or a silver bullet. It’s a valuable starting point, at least if the evidence you have access to is high-quality, but it doesn’t come with guarantees. Too many policy-makers assume that if a research report says something has worked somewhere, it will automatically work everywhere. It really isn’t that easy. That’s why I think evidence has to go hand-in-hand with (’scuse the buzzwords) professional ownership and local adaptation. In essence, top-down doesn’t really work; but bottom-up works best if it’s grounded in the evidence of what’s already been tried by your colleagues elsewhere.

Here’s a snippet, in which I appeal to government to take evidence-generation seriously by commissioning proper evaluations…

… across government, evaluation is too often treated as an afterthought. A recent National Audit Office report on the quality of almost 6,000 government evaluations found the strength of evidence to be in inverse proportion to the claims made for the effectiveness of policies: the most positive claims were based on the weakest research. This is graphically illustrated below:

[NAO evaluations graphic]

Civil servants may well argue they take their cue from their political masters. Research by the Institute for Government has highlighted how some politicians may sidestep the ‘inconvenient truths’ of evidence for fear of unpopularity, and may be unwilling to commit to multi-year evaluations which don’t fit the electoral timetable.

I do not lightly dismiss such difficulties. However, there are two things government can do. First, it can look at the evidence that already exists for pointers as to which policies are most likely to work effectively. Second, when trying out new ideas, it can ensure the pilots are set up in a way that allows their effects to be properly evaluated.

As the economist Tim Harford points out:

While randomised trials are not going to tell us when to raise interest rates or get out of Afghanistan, there are many policies that could and should be tested with properly controlled trials. Is Jamie Oliver right to emphasise healthy school meals? Run a trial. Should young offenders be sent to boot camp, or to meet victims of crime? Run a trial. What can we do to persuade households to use less electricity? Run a trial.

The evidence generated will offer us a much more secure basis for informing decision-making across social policy. The risk otherwise is that we over-promise the impact of effective prevention and under-deliver on the reality.

You can download the whole ‘One Hundred Days for Early Action’ publication here.

