Broadening the Range of Designs and Methods for Impact Evaluations

Affiliation

Department for International Development (DFID)

Summary

"Impact Evaluation (IE) aims to demonstrate that development programmes lead to development results, that the intervention as cause has an effect."

This report brings together the findings and conclusions of a study on IE commissioned by DFID. The study was intended to broaden the range of designs and methods used in IE.

Excerpts from the Executive Summary:
"...Accountability for expenditure and development results is central to IE, but at the same time as policy makers often wish to replicate, generalise and scale up, they also need to accumulate lessons for the future. Explanatory analysis, by answering the 'hows' and 'whys' of programme effectiveness, is central to policy learning.

IE must... [be] decentralised [and] work... through partnership and where developing countries are expected to be in the lead. These normative principles have practical implications for IE. For example, working through partners leads to multi-stage, indirect causal chains that IE has to analyse; and using a country’s own systems can limit access to certain kinds of data.... Three elements - evaluation questions; appropriate designs and methods; and programme attributes - have to be reconciled when designing IE.

Demonstrating that interventions cause development effects depends on theories and rules of causal inference that can support causal claims. Some of the most potentially useful approaches to causal inference are not generally known or applied in the evaluation of international development and aid....

Designs need to build on causal inference approaches each of which have their strengths and weaknesses, one of the reasons that combining designs and methods - so called 'mixed methods' - are valuable. Combining methods has also become easier because the clear distinctions between quantitative and qualitative methods have become blurred, with quantitative methods that are non-statistical and new forms of within-case analysis made easier by computer aided tools.

...The study has concluded that most development interventions are 'contributory causes'. They 'work' as part of a causal package in combination with other 'helping factors' such as stakeholder behaviour, related programmes and policies, institutional capacities, cultural factors or socio-economic trends. Designs and methods for IE need to be able to unpick these causal packages.

...It is often more informative to ask: 'Did the intervention make a difference?' which allows space for combinations of causes rather than 'Did the intervention work?' which expects an intervention to be cause acting on its own.

...Tailored evaluation strategies are needed to respond to these attributes [duration and time scale; nonlinearity and unpredictability; local customisation of programmes; indirect delivery through intermediate agents such as funds; multiple interventions that influence each other]....A reality that often has to be faced in IE is that there is a trade off between the scope of a programme and strength of causal inference. It is easier to make strong causal claims for narrowly defined interventions and more difficult to do so for broadly defined programmes. The temptation to break programmes down into sub-parts is therefore strong, however this risks failing to evaluate synergies between programme parts and basing claims of success or failure on incomplete analysis....Results monitoring [for long-term programmes] may need to be prioritised alongside a staged evaluation strategy able to respond to changes in implementation trajectories not anticipated at programme launch.

Quality Assurance (QA) for IE is needed both to reassure policy makers that evaluation conclusions are defensible and to reinforce good practice within the evaluation community...Standards such as validity, reliability, rigour and transparency have therefore been incorporated into a three part QA framework covering: the conduct of the evaluation; the technical quality of methods used and normative aspects appropriate to IE in an international development setting."

A chart on page 126 summarises several of the arguments presented in this paper.

Source

Email from Bojan Radej to The Communication Initiative on July 24, 2012. Image credit: Social Innovation in Europe (SIE) website