Generating demand for and use of evaluation evidence in government health ministries: lessons from a pilot programme in Uganda and Zambia

Bibliographic Details
Published in: Health Research Policy and Systems, 2017, Vol. 15(1), Article 86
Main Authors: Witter, Sophie; Kardan, Andrew; Scott, Molly; Moore, Lucie; Shaxson, Louise
Format: Article
Language: English
Summary: The Demand-Driven Evaluations for Decisions (3DE) programme was piloted in Zambia and Uganda in 2012-2015. It aimed to answer evaluative questions raised by policymakers in Ministries of Health, rapidly and with limited resources. The aim of our evaluation was to assess whether the 3DE model was successful in supporting and increasing evidence-based policymaking, building capacity and changing the behaviour of Ministry staff. Using mixed methods, we compared the ex-ante theory of change with what had happened in practice, why, and with what results (intended and unintended), including a qualitative assessment of 3DE's contribution. Data sources included a structured quality assessment of the five impact evaluations produced, 46 key informant interviews at national and international levels, structured extraction from 170 programme documents, a wider literature review of relevant topics, and a political economy analysis conducted in Zambia. We found that 3DE made a very limited contribution to changing evidence-based policymaking, capacity and behaviour in both countries, largely because it pursued a number of aspirations that were not all compatible with one another. Co-developing evaluation questions was more time-consuming than anticipated; Ministry evidence needs did not fit neatly into questions suitable for impact evaluations; and constricted timeframes for undertaking trials did not necessarily produce the most effective results or value for money. The evaluation recommended narrowing the programme's objectives and taking a more strategic approach to strengthening evaluative demand and capacity. Lessons emerge that are likely to apply in other low- and middle-income settings, such as the importance of supporting evaluative thinking and capacity within wider institutions, of understanding the political economy of evidence use and uptake, and of allowing for some flexibility in programme targets. Fixating on one type of evidence is unhelpful in the context of institutions like ministries of health, which require a wide range of evidence to plan and deliver programmes. In addition, tying success to indicators such as the number of 'policy decisions made' creates potentially perverse incentives and neglects arguably more important outcomes, such as incremental programmatic adjustments and improved implementation.
ISSN: 1478-4505