Médecins Sans Frontières (MSF)/Doctors Without Borders is an international medical humanitarian organization determined to bring quality medical care to people in crises around the world, when and where they need it, regardless of religion, ethnic background, or political views. Our fundamental principles are neutrality, impartiality, independence, medical ethics, bearing witness, and accountability.
The Stockholm Evaluation Unit (SEU), based in Sweden, is one of three MSF units tasked with managing and guiding evaluations of MSF's operational projects. For more information, see evaluation.msf.org.
The commitment to evaluation at MSF stems primarily from the La Mancha Agreement (2006), which states that MSF aspires to ensure the quality, relevance, and extent of its operations, and commits to assessing the impact and effectiveness of its work so that good work can be multiplied and ineffective practice abandoned. MSF Operational Centre Brussels (OCB) elaborates on this commitment in its 2020-23 Strategic Orientations, stating that it wants to develop "a culture of evaluation to give the field teams the opportunity to learn from [their] practices and to constantly improve the quality and pertinence of operational/medical interventions."
MSF does not accept institutional funding from most bilateral donors, removing what is often an impetus for evaluation at other non-governmental organizations. Learning is most often cited as the predominant intention behind wanting to evaluate. This can relate directly to the individual project, to future programming in the country or region, to informing advocacy (vis-à-vis, for example, a country's ministry of health), as well as to institutional learning.
For OCB, evaluation is about assessing the design, strategy, implementation, and results of medical and humanitarian interventions, measured against established MSF or international standards (SEU Steering Committee Framework, 2019). A dedicated unit, the Stockholm Evaluation Unit (SEU), manages primarily external evaluations, but on occasion conducts internal evaluations as well. They cover a range of medical operational topics (e.g., migration, non-communicable disease, HIV/AIDS) and, in some cases, topics related more to organizational set-ups and strategies.
There is currently no formally adopted quality framework for the evaluations managed by the SEU on behalf of OCB, although the unit's work is influenced by several frameworks, including the Joint Committee on Standards for Educational Evaluation (JCSEE) Program Evaluation Standards, the ALNAP Proforma, and various evaluator competency frameworks, including those of the American Evaluation Association and the United Nations Evaluation Group (UNEG). Ideas on what constitutes quality or value for different stakeholders in evaluation within the context of MSF and OCB likely differ across the organization. It will therefore be necessary to establish a framework of accepted criteria as part of the evaluation process.
This meta-evaluation will assess completed evaluations carried out between 2017 and 2022 that the SEU managed at the request of, or with direct and significant involvement of, OCB. Other entities at OCB conduct their own internal analytical exercises (e.g., retrospectives), but these are out of scope for this evaluation.
Purpose and Intended use
The purpose of this meta-evaluation is to assess the quality and value of OCB evaluations. The intention is not to evaluate the SEU's performance but rather the evaluations (individually and collectively) that have been finalized at OCB's request and that the unit has managed. This should not be a technocratic exercise, based on checklists that review whether specific elements (e.g., an inception report) have been included, but rather an analytical exercise that assesses the value of the completed evaluations to OCB, ranging from individual projects to the organization as a whole.
This meta-evaluation should help to build a coherent understanding of what constitutes the value and quality of evaluations to OCB. Results should explore the factors influencing the value and quality of evaluations, and how these can be increased within the organizational context. Understanding evaluations' significance can help shed light on their worth. The primary recipients of the meta-evaluation are the SEU and the SEU Steering Committee; the secondary recipients are the OCB Board, the OCB association, and staff.
As stated, the SEU does not currently maintain a formalized quality framework defining what constitutes quality and value in evaluations at OCB. That said, it is guided by three overarching areas (methods, use, and values) that can serve as subheadings for standards.
The proposal must suggest the most appropriate criteria to be used, which will then be elaborated upon and finalized as part of the inception phase.
1. Inception Report
The inception report (IR) should include a detailed evaluation proposal, including the methodology and evaluation protocol. The IR must elaborate on the evaluand and the evaluation questions, and include the proposed criteria to be used to assess quality.
2. Draft Evaluation Report
The draft evaluation report should answer the evaluation questions and will include analysis, findings, and conclusions and, if necessary, lessons learned and recommendations.
3. Working Session
As part of the report-writing process, a working session will be held with the commissioner, consultation group members, and the SEU evaluation manager. The evaluator will present the preliminary findings, collect feedback, and facilitate a discussion on recommendations (either to co-create recommendations or, if already developed, to discuss their feasibility).
4. Final Evaluation Report
The final report will address feedback received during the working session and written input from the feedback loop.
5. Presentation of the Final Evaluation Report
The final report will be presented to a general OCB audience in the form of a webinar.
The key deliverables (inception report, draft/final report) will be processed through a feedback loop, collecting input from the consultation group (see below, Practical Implementation of the Evaluation). They are then endorsed by the evaluation’s commissioner.
TOOLS AND METHODOLOGY PROPOSED
In addition to the initial evaluation proposal submitted as part of the application, a detailed evaluation protocol should be prepared by the evaluators during the inception phase. It will include a detailed explanation of the proposed methods and their justification based on validated theories. It will be reviewed and validated as part of the inception phase in coordination with the SEU.
Evaluations and other evaluative exercises managed by the SEU for OCB 2017-2022
Existing SEU plans, guidelines, and policies
SEU Steering Committee framework (2019)
OCB Strategic Orientation 2020-2023, OCB Strategic Prospects 2020-2023
PRACTICAL IMPLEMENTATION OF THE EVALUATION
Number of evaluator(s)
Timing of the evaluation
Start May 2022 – finish September 2022
The SEU and its steering committee will establish a consultation group (CG) to accompany this evaluation. The CG, which is led by a commissioner, has contributed to finalizing this ToR.
PROFILE/REQUIREMENTS FOR EVALUATOR(S)
The evaluation requires an individual or team of individuals who can demonstrate competencies in the following areas.
Relevant evaluation competencies, preferably with experience in implementing a meta-evaluation like the one being proposed
a. Professional focus – acts ethically, reflectively, enhances and advances professional practice of evaluation.
b. Technical focus – applies appropriate evaluation methodology.
c. Situational focus – considers and analyses evaluation context successfully.
d. Management focus – conducts and manages evaluation projects skillfully.
e. Communication focus – interacts and communicates successfully with stakeholders.
a. Humanitarian program management, including humanitarian program monitoring and evaluation and/or knowledge management and learning.
b. Fluency in English, spoken and written. French is a benefit.
The La Mancha Agreement was adopted in Athens, Greece in 2006 following a process of discussion and debate to address internal challenges. https://msf.org/sites/msf.org/files/La%20Mancha%20Agreement%20EN.pdf.
This constitutes roughly 28 evaluations and other evaluative exercises.
How to apply:
The application should consist of a technical proposal in English, a budget proposal, a CV, and a previous work sample. The proposal should include a reflection on how adherence to ethical standards for evaluations will be considered throughout the evaluation. In addition, the evaluator(s) should consider and address the sensitivity of the topic at hand in the methodology, and this should also be reflected in the team set-up. Offers should include a separate quotation for the complete services, stated in euros. The budget should present the consultancy fee according to the number of expected working days over the entire period, both in total and as a daily fee. Travel costs, if any, do not need to be included, as the SEU will arrange and cover these. Please note that MSF does not pay any per diem.
Applications will be evaluated on whether the submitted proposal demonstrates an understanding of the main deliverables as per this ToR, a methodology relevant to achieving the results foreseen, and the overall capacity of the evaluator(s) to carry out the work (e.g., inclusion of proposed evaluators' CVs, references to previous work, certifications, et cetera).
Interested teams or individuals should apply to firstname.lastname@example.org, referencing [META], no later than Sunday, April 24, 23:59 CET. We would appreciate the necessary documents being submitted as separate attachments (proposal, budget, CV, work sample, and such). Please include your contact details in your CV.
Please indicate in your email application on which platform you saw this vacancy.