Evaluation Design for Complex Global Initiatives: Workshop Summary. 1st Edition. Institute of Medicine.
For more information about the Institute of Medicine, visit the IOM
home page at: www.iom.edu.
The serpent has been a symbol of long life, healing, and knowledge
among almost all cultures and religions since the beginning of
recorded history. The serpent adopted as a logotype by the Institute
of Medicine is a relief carving from ancient Greece, now held by the
Staatliche Museen in Berlin.
www.national-academies.org
PLANNING COMMITTEE FOR THE WORKSHOP ON EVALUATING
LARGE-SCALE, COMPLEX, MULTI-NATIONAL GLOBAL HEALTH INITIATIVES1
IOM Staff
BRIDGET B. KELLY, Project Co-Director
KIMBERLY A. SCOTT, Project Co-Director
KATE MECK, Associate Program Officer
CHARLEE ALEXANDER, Senior Program Assistant (from November
2013)
JULIE WILTSHIRE, Financial Officer
PATRICK W. KELLEY, Senior Board Director, Board on Global
Health
_____________
1 Institute of Medicine planning committees are solely responsible for organizing
the workshop, identifying topics, and choosing speakers. The responsibility for the
published workshop summary rests with the workshop rapporteur and the
institution.
Reviewers
REFERENCES
APPENDIXES
A STATEMENT OF TASK
B WORKSHOP AGENDA
C PARTICIPANT BIOGRAPHIES
D EVALUATION INFORMATION SUMMARY FOR CORE EXAMPLE
INITIATIVES
E EVALUATION DESIGN RESOURCES HIGHLIGHTED AT THE
WORKSHOP
1
Understanding Context
The evaluators emphasized that understanding the relationship
between context and an initiative's desired outcomes is critical to
both intervention and evaluation design. Sridharan noted that it is
best to bring knowledge of context in at the start, but also to
understand how that context evolves and adapts over time. Mary
Bassett of the Doris Duke Charitable Foundation reiterated the need
to devote heightened attention to context, referring to how Elliot
Stern in his comments “really challenged us to unpack the notion of
what context means.” Contextual issues arise at micro-, meso-, and
macro-scales, and all can be important. Droughts, economic crises,
and political changes are some of the factors that can affect the
outcome of an initiative and should be tracked, but it is also
important to think about how to understand the issues of leadership,
power, trust, communication, and community engagement that have
all been talked about, Bassett said.
Evaluators
Funders
Create incentives for learning.
Make reasonable demands of evaluators, and fund at the
right level.
Permit or require transparency.
_____________
1 The planning committee’s role was limited to planning the workshop. The
workshop summary has been prepared by the workshop rapporteur (with the
assistance of Charlee Alexander, Bridget Kelly, Kate Meck, and Kimberly Scott) as a
factual summary of what occurred at the workshop. Statements,
recommendations, and opinions expressed are those of individual presenters and
participants; they are not necessarily endorsed or verified by the Institute of
Medicine, and they should not be construed as reflecting any group consensus.
2
After agreeing with the points that the previous speakers had
made, Christopher Whitty, chief scientific advisor at the United
Kingdom’s Department for International Development (DFID), noted
that, in his view, program officials who work outside of the health
care arena have not historically had much appreciation for the fact
that “good ideas, passionately delivered by people to the highest
quality, may not work.” As a result, outside of health care, not much
value has been placed on evaluation, though he acknowledged that
this situation is changing for the better. Other positive developments,
he said, include the improvement in the methodologies available for
performing complex evaluations and greater acceptance that mixed
methods approaches, or using both quantitative and qualitative
methods for data collection and analysis, are important for
evaluations.
In his role as a commissioner at DFID, Whitty is on the receiving
end of evaluations and has seen a number of outstanding
evaluations over the past few years, including the independent
evaluation of the Affordable Medicines Facility–malaria (AMFm). Most
evaluations, however, have not been outstanding, and he observed
that some of the reasons are on the side of those who request and
fund evaluations. The biggest problem from the donor perspective,
he said, is that those who commission evaluations have multiple
goals for the evaluation that, while not necessarily incompatible,
actually require distinctly different approaches. One goal is to
provide assurance to those who pay for a given program—the British
public in his case—that their money is not being wasted. A second
goal is to check on the efficacy of a program and make course
corrections if needed. The third goal is impact evaluation—what
about a program has worked and what has not, what has been cost-
effective and what has not, and what aspects can be improved in the
next iteration of the program?
The problem, said Whitty, is that those who ask for evaluations
often are asking for a single evaluation that meets all three goals at
the same time. “If someone asks you for all three, you have to tell
them that they are different things and they are going to have to
pay more and probably have to do it by at least two different
mechanisms.” This discussion has to take place up-front between the
person who would do the evaluation and the person commissioning
an evaluation to avoid wasting time and money on pointless
activities, he added. Another confounding factor is that most of the
large, complex programs are conceived by what Whitty characterized
as “very smart, very politically connected, and very charismatic true
believers.” The resulting political realities have to be considered in
the initial design discussions between funders and evaluators.
On the side of the evaluators, poorly performed evaluations are
often the result of the difficulty of evaluating complex programs.
“What we are talking about here is intrinsically difficult. Many of
these things are really hard to evaluate.… I do not believe there is
such a thing as perfect design for most of the things we are talking
about in this meeting.” He described judging whether a design is
poorly conceived by asking whether he would change his mind about
a program if the evaluation did not provide the answer he expected
or desired. If the evaluation is not “sufficiently strong
methodologically that you are forced to change your mind,” he
stated, then “you probably should not do the evaluation in the first
place. That seems to me to be a common sense test.” Another
problem that he sees on the delivery side is that evaluations of
complex programs require teams comprising individuals with a wide
range of skills, and assembling such multidisciplinary teams is
difficult. How to facilitate the formation of multidisciplinary teams is
“something that we as donors as well as providers need to think
through,” he said.