Technical Note
Evaluation Matrix
Contents
1. Introduction
2. What is an Evaluation Matrix?
3. What does an Evaluation Matrix look like and what does it include?
4. How is the Evaluation Matrix used?
1. Introduction
1. This Technical Note (TN) clarifies the purpose and content of an evaluation matrix and provides guidance on when and how to use it. It is intended for the evaluation team responsible for developing the evaluation matrix as part of the inception report. The note is also relevant for the Evaluation Manager (EM), who is responsible for quality assuring the inception report. The evaluation matrix is needed for all types of WFP evaluations.
2. What is an Evaluation Matrix?
3. The Evaluation Matrix serves as an organizing tool to help plan the conduct of the evaluation, indicating where
secondary data will be used and where primary data will need to be collected. It guides analysis, ensures that
all data collected is analysed and triangulated and supports the identification of evidence gaps. As such, the
Evaluation Matrix ensures that the evaluation design is robust, credible (reducing subjectivity in the evaluative
judgement) and transparent.
¹ A pre-analysis plan (PAP) sets out the hypotheses to be tested and specifies the methods of analysis and measurement approaches used. The PAP should be registered in a recognised repository (such as the American Economic Association’s registry for randomized controlled trials, or the Registry for International Development Impact Evaluations (RIDE)) before the start of an impact evaluation. In WFP, each impact evaluation window has a ‘window level’ pre-analysis plan that guides the development of individual impact evaluation PAPs.
October 2020
3. What does an Evaluation Matrix look like and what does it include?
4. The table below presents an example of an Evaluation Matrix. While the template should be followed, the content should be contextualized and adapted to each evaluation. To develop the matrix, the evaluation team will list the evaluation questions, break them down into sub-questions and, for each one, identify what data will be collected to answer the question, which data collection methods will be used, from which sources, how the data will be analysed, and how the strength of the evidence will be assessed. Additional examples can be found in the annexes to WFP Evaluation Reports, available at: http://www.wfp.org/evaluation
Evaluation Question 1 (example): How appropriate was the intervention? Criterion: Relevance

Sub-questions
These are taken from the TOR (explicitly or implicitly) and further refined based on new information and discussions with stakeholders during the inception phase. Evaluation questions and sub-questions should be well-specified and relevant to the evaluation objectives and purpose. They should relate to the overarching evaluation question and be developed at a level that provides direction to the evaluation, not at the level of a questionnaire or field instrument. Sub-questions relating to gender equality and women’s empowerment should be included, or a justification given for why they are not.
Example: Were the activity’s targeting and transfer modalities appropriate to the food security and nutrition needs of women, men, boys and girls from different groups?

Indicators
The indicators and measures determine how performance or progress is judged for each sub-question. Indicators should be realistic in terms of data collection within the scope of the evaluation (see below). Measures/indicators should be clear and measurable (either qualitatively or quantitatively) and correspond to the evaluation question or sub-question.
Examples: Stakeholder perceptions of the degree to which the needs of different groups were identified appropriately and targeting was done based on needs; % of beneficiaries who say that the service met their needs (by group); the degree to which beneficiaries feel/perceive that the service was tailored to their needs.

Data Collection Methods
Covers what detailed data collection methods will be used to collect the required data and information for each question. This can include quantitative beneficiary surveys, key informant interviews, desk review, etc. Although there are usually various methods for each question, data collection should be systematically mapped back to the evaluation questions that were asked (or vice versa).
Examples: Desk review using a structured framework; key informant interviews; focus groups with beneficiaries.

Main Sources of Data/Information
Sets out where the evaluation team will get the information and data to answer each question. This stage is critical to informing the evaluation design. Data sources should be realistic in terms of primary data gathering.
Examples: 2015 WFP VAM analyses; government needs analysis study (2014); UNHCR evaluation 2015; data from key informant interviews with cooperating partners, ministry representatives, and partner UN and donor representatives; data from beneficiary focus groups (held separately with women and girl beneficiaries).

Data Analysis Methods/Triangulation
Documents how all data that is collected is analysed to ensure it can answer the evaluation questions. This helps to avoid collecting data that is not useful, and clearly shows how data is triangulated. Methods can include regression analysis, statistics and qualitative analysis. Analytical methods should be appropriate for the data collected, and data analysis should be systematically mapped back to the evaluation questions that were asked (or vice versa).
Examples: Narrative/thematic analysis of secondary data; discourse analysis of primary data (interviews/focus groups); data disaggregation (women/vulnerable groups).

Data Availability/Reliability
Strength of evidence for each evaluation question. Can be recorded as colour coding (green/amber/red), numerically (3=strong, 2=fair, 1=weak), or with narrative descriptors (strong, fair, weak).
Example: 3 (strong)
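The matrix structure above can be sketched as a simple record, one per sub-question. This is a hypothetical illustration, not a WFP tool: the field names mirror the column headings of the example table, and the numeric strength coding follows the 3=strong, 2=fair, 1=weak convention described in the last column.

```python
from dataclasses import dataclass, field

# Numeric coding for strength of evidence (3=strong, 2=fair, 1=weak),
# as described in the Data Availability/Reliability column above.
STRENGTH = {"strong": 3, "fair": 2, "weak": 1}

@dataclass
class MatrixRow:
    """One row of an Evaluation Matrix (illustrative structure only)."""
    sub_question: str
    indicators: list = field(default_factory=list)
    collection_methods: list = field(default_factory=list)
    data_sources: list = field(default_factory=list)
    analysis_methods: list = field(default_factory=list)
    evidence_strength: str = "fair"  # "strong" / "fair" / "weak"

    def strength_score(self) -> int:
        """Return the numeric coding for the narrative descriptor."""
        return STRENGTH[self.evidence_strength]

# Populate a row with the worked example from the table.
row = MatrixRow(
    sub_question=("Were the activity's targeting and transfer modalities "
                  "appropriate to the food security and nutrition needs of "
                  "women, men, boys and girls from different groups?"),
    indicators=["% of beneficiaries who say the service met their needs (by group)"],
    collection_methods=["desk review", "key informant interviews", "focus groups"],
    data_sources=["2015 WFP VAM analyses", "beneficiary focus groups"],
    analysis_methods=["thematic analysis of secondary data", "data disaggregation"],
    evidence_strength="strong",
)
print(row.strength_score())  # → 3
```

Keeping each row as one record makes it straightforward to check later that every sub-question has methods, sources and a reliability rating filled in.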
4. How is the Evaluation Matrix used?
5. The Evaluation Matrix is one of the key products of the inception phase. It is developed by the evaluation team once the evaluation questions have been reviewed and confirmed and the available secondary sources have been compiled and quality checked, in line with the focus of the evaluation questions and any evaluability limitations. The matrix also helps clarify expectations between the EM and the evaluation team, improving the transparency of the evaluation process. The EM should ensure that the evaluation team uses and follows the agreed Evaluation Matrix throughout the data collection and reporting phases to guide data collection, analysis and report writing.
6. When developing the Evaluation Matrix it is important to understand how different methods and types of data will be combined to answer different questions, how different data sources will be used to answer the same evaluation question, and how any triangulation will be undertaken. The Evaluation Matrix is also useful for reviewing the design in light of gender and wider equity dimensions, to ensure that the perspectives and concerns of different population groups, including marginalized groups, will be considered. Table 1 describes how the Evaluation Matrix should be used during each evaluation phase.
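The triangulation review described above can be made mechanical: flag any sub-question that relies on fewer than two distinct data sources or fewer than two collection methods. The dict-of-lists layout below is an assumption for illustration, not a WFP template.

```python
# Hypothetical triangulation check over a draft Evaluation Matrix.
# Each key is a sub-question; "sources" and "methods" list where and
# how the evidence for that question will be gathered.

def triangulation_gaps(matrix: dict) -> list:
    """Return sub-questions whose evidence cannot be triangulated
    (fewer than two distinct sources or two distinct methods)."""
    gaps = []
    for sub_q, entry in matrix.items():
        if len(set(entry["sources"])) < 2 or len(set(entry["methods"])) < 2:
            gaps.append(sub_q)
    return gaps

matrix = {
    "Was targeting based on need?": {
        "sources": ["VAM analysis 2015", "beneficiary focus groups"],
        "methods": ["desk review", "focus groups"],
    },
    "Were transfer modalities appropriate?": {
        "sources": ["beneficiary survey"],  # single source: a gap
        "methods": ["survey"],
    },
}
print(triangulation_gaps(matrix))  # → ['Were transfer modalities appropriate?']
```

Running such a check at inception, before fieldwork begins, surfaces questions whose answers would rest on a single source.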
Table 1: Use of the Evaluation Matrix during each evaluation phase

Phase 1: Planning
N/A

Phase 2: Preparation
➢ The EM is responsible for formulating clear and relevant evaluation questions, linked to the appropriate evaluation criteria, in line with the purpose, objectives and intended use of the evaluation, as well as with the intervention Theory of Change (ToC) if it exists.
Phase 3: Inception
➢ The evaluation team is responsible for developing the Evaluation Matrix at the inception phase, based on the evaluation questions, the proposed methodological approach in the TOR and the ToC. If no ToC was elaborated during the design of the intervention, the evaluation team is expected to reconstruct the ToC and validate it through stakeholder consultations during the inception phase.
➢ The evaluation team refines and finalises the evaluation questions and expands them with sub-questions as needed. It then develops an appropriate evaluation and analytical approach for the evaluation. This implies selecting appropriate quantitative indicators and/or qualitative analysis dimensions, data collection tools and analytical methods for each evaluation question. This should be documented systematically in the Evaluation Matrix.
➢ The Evaluation Matrix should be included in an annex of the inception report. It is
complemented by other tools such as data collection questionnaires and protocols, field
mission plans, etc.
➢ The EM checks the quality of the Evaluation Matrix when reviewing the draft Inception Report
and ensures that it provides:
o A breakdown of the main questions into an adequate number of sub-questions in such a way that
it enables a systematic assessment against the evaluation questions, keeping the evaluation
focused to attain depth of analysis in line with the evaluation purpose/objectives;
o An overview of how each of the evaluation questions and evaluation criteria will be addressed,
including GEWE dimensions;
o A set of indicators explicitly referring to the ToC used;
o Specific data collection methods;
o All relevant sources of information, specifying whether secondary data will be used and where
primary data is needed;
o An overview of how triangulation will take place;
o Reference to the availability and reliability of the data.
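The EM's review checklist above can likewise be sketched as a completeness check: verify that every entry in the draft matrix fills in each required column. The field names below are assumptions chosen to mirror the checklist, not a WFP data format.

```python
# Hypothetical sketch of the EM's quality check on a draft matrix entry:
# every checklist item must be present and non-empty.

REQUIRED = ["sub_questions", "indicators", "methods", "sources",
            "triangulation", "reliability"]

def missing_fields(question_entry: dict) -> list:
    """List the checklist items left empty for one evaluation question."""
    return [f for f in REQUIRED if not question_entry.get(f)]

entry = {
    "sub_questions": ["Was targeting appropriate?"],
    "indicators": ["% of beneficiaries reporting needs met"],
    "methods": ["survey", "key informant interviews"],
    "sources": ["beneficiary survey", "partner interviews"],
    "triangulation": "compare survey results with interview themes",
    "reliability": "",  # not yet rated
}
print(missing_fields(entry))  # → ['reliability']
```

An entry returning a non-empty list would be sent back to the evaluation team before the inception report is approved.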
Phase 4: Data Collection
➢ The evaluation team collects primary and secondary data to measure the quantitative indicators and/or assess the qualitative analysis dimensions that have been identified, using the methods and tools agreed in the Evaluation Matrix. The evaluation team subsequently analyses the collected data and information to address the evaluation questions, using the analytical methods documented in the Evaluation Matrix.
➢ As the team collects and analyses data, it assesses the quality and availability of data from different sources and updates the information on the availability and reliability of the evidence collected in the Evaluation Matrix. Any changes from the Evaluation Matrix when collecting data should be agreed with the EM and documented explicitly.
Phase 5: Data Analysis and Reporting
➢ The Evaluation Matrix is used by the evaluation team to inform the analysis, including any triangulation. Any changes from the Evaluation Matrix when analysing the data should be documented. Findings and conclusions should be set out against the evaluation questions and follow systematically from the data collected and the analysis. The Evaluation Matrix should be included as an annex to the final evaluation report.
➢ The EM reviews and comments on the draft report using the Evaluation Matrix as a reference point. More specifically, s/he checks that all the evaluation questions have been addressed and that evidence has been collected, analysed and triangulated as proposed in the Evaluation Matrix.
5. Further reading
• Morra Imas, Linda G. and Ray C. Rist (2009) The Road to Results: Designing and Conducting Effective Development Evaluations. The World Bank, Washington, DC.
• Rist, Ray and Linda Morra Imas (2010) IPDET Training. Module 6: Developing Evaluation Questions & Starting the Design Matrix, and Module 7: Selecting Designs for Cause-and-Effect, Normative, and Descriptive