
To evaluate or not: Evaluability study of 40 interventions of Belgian development cooperation

Author

Listed:
  • Holvoet, Nathalie
  • Van Esbroeck, Dirk
  • Inberg, Liesbeth
  • Popelier, Lisa
  • Peeters, Bob
  • Verhofstadt, Ellen

Abstract

Due to the increasing importance of evaluations within development cooperation, it has become all the more important to analyse whether the initial conditions for high-quality and relevant evaluations are met. This article presents the findings from an evaluability study of 40 interventions of Belgian development cooperation. A study framework was developed focusing on three key dimensions (i.e. theoretical evaluability, practical evaluability and the evaluation context), each subdivided across the different OECD/DAC criteria. Drawing upon a combination of desk and field research, the study framework was subsequently applied to a set of 40 interventions in Benin, the DRC, Rwanda and Belgium. Findings highlight that the context dimension scores remarkably better than the theoretical and, especially, the practical evaluability dimensions. The large majority of the interventions have the conditions in place to satisfactorily evaluate effectiveness and efficiency, while the opposite holds for sustainability and, in particular, impact. These findings caution against commissioning evaluations that ritually cover all OECD/DAC criteria regardless of an intervention's readiness. They underscore the usefulness of a flexible ‘portfolio’ approach towards evaluations, in which a more systematic use of evaluability assessment from the start of interventions could play a role.
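
To make the framework concrete, the sketch below shows one minimal way such an evaluability grid could be operationalised: each intervention receives a score per OECD/DAC criterion on each of the three dimensions, and only criteria whose scores clear a readiness threshold enter the evaluation ‘portfolio’. The 0–1 scale, the `ready_criteria` helper, the threshold value and the aggregation by simple mean are illustrative assumptions, not the article's actual scoring method.

```python
# Hypothetical sketch (not from the article): an evaluability grid scoring
# each OECD/DAC criterion on the study's three dimensions, then flagging
# which criteria are "ready" to be evaluated -- the 'portfolio' idea.
# Dimension and criterion names follow the abstract; scores, the 0-1 scale
# and the threshold are invented for illustration.

DIMENSIONS = ("theoretical", "practical", "context")
CRITERIA = ("relevance", "effectiveness", "efficiency", "impact", "sustainability")

def ready_criteria(scores: dict[str, dict[str, float]], threshold: float = 0.6) -> list[str]:
    """Return the OECD/DAC criteria whose mean score across the three
    evaluability dimensions meets the (assumed) readiness threshold."""
    ready = []
    for criterion in CRITERIA:
        mean = sum(scores[criterion][d] for d in DIMENSIONS) / len(DIMENSIONS)
        if mean >= threshold:
            ready.append(criterion)
    return ready

# Example: an intervention with strong effectiveness/efficiency conditions
# but weak impact and sustainability, mirroring the study's overall finding.
example = {
    "relevance":      {"theoretical": 0.7, "practical": 0.6, "context": 0.8},
    "effectiveness":  {"theoretical": 0.8, "practical": 0.7, "context": 0.9},
    "efficiency":     {"theoretical": 0.7, "practical": 0.7, "context": 0.8},
    "impact":         {"theoretical": 0.4, "practical": 0.3, "context": 0.7},
    "sustainability": {"theoretical": 0.5, "practical": 0.3, "context": 0.7},
}
print(ready_criteria(example))  # -> ['relevance', 'effectiveness', 'efficiency']
```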

Suggested Citation

  • Holvoet, Nathalie & Van Esbroeck, Dirk & Inberg, Liesbeth & Popelier, Lisa & Peeters, Bob & Verhofstadt, Ellen, 2018. "To evaluate or not: Evaluability study of 40 interventions of Belgian development cooperation," Evaluation and Program Planning, Elsevier, vol. 67(C), pages 189-199.
  • Handle: RePEc:eee:epplan:v:67:y:2018:i:c:p:189-199
    DOI: 10.1016/j.evalprogplan.2017.12.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718917301507
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2017.12.005?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Holvoet, N. & Renard, Robrecht, 2007. "Monitoring and evaluation under the PRSP: Solid rock or quicksand?," Evaluation and Program Planning, Elsevier, vol. 30(1), pages 66-81, February.
    2. Lant Pritchett & Salimah Samji & Jeffrey S. Hammer, 2012. "It's All about MeE: Using Structured Experiential Learning ('e') to Crawl the Design Space," WIDER Working Paper Series wp-2012-104, World Institute for Development Economic Research (UNU-WIDER).
    3. Smith, M. F., 1990. "Evaluability assessment: Reflections on the process," Evaluation and Program Planning, Elsevier, vol. 13(4), pages 359-364, January.
    4. D’Ostie-Racine, Léna & Dagenais, Christian & Ridde, Valéry, 2013. "An evaluability assessment of a West Africa based Non-Governmental Organization's (NGO) progressive evaluation strategy," Evaluation and Program Planning, Elsevier, vol. 36(1), pages 71-79.
    5. Rob Lloyd & Derek Poate & Espen Villanger, 2014. "Results measurement and evaluability: a comparative analysis," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(4), pages 378-391, December.
    6. Holvoet, Nathalie & Dewachter, Sara, 2013. "Building national M&E systems in the context of changing aid modalities: The underexplored potential of National Evaluation Societies," Evaluation and Program Planning, Elsevier, vol. 41(C), pages 47-57.
    7. repec:idb:brikps:27118 is not listed on IDEAS

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Holvoet, Nathalie & Casten, Wanda & Demissie, Eshetu Woldeyohannes & Dewachter, Sara & Gamboa, Marian Kaye C. & Adhanom, Tewelde Gebremariam & Ibrahim, Abdurahman Hamza & Makundi, Hezron & Manguni, Gr, 2023. "Theory-based evaluation of the impact of Master’s programmes in development studies: Insights from a mixed-methods and multicultural alumni action research project," Evaluation and Program Planning, Elsevier, vol. 97(C).
    2. Valérie Pattyn, 2019. "Towards Appropriate Impact Evaluation Methods," The European Journal of Development Research, Palgrave Macmillan; European Association of Development Research and Training Institutes (EADI), vol. 31(2), pages 174-179, April.
    3. Shyam Singh & Nathalie Holvoet & Vivek Pandey, 2018. "Bridging Sustainability and Corporate Social Responsibility: Culture of Monitoring and Evaluation of CSR Initiatives in India," Sustainability, MDPI, vol. 10(7), pages 1-19, July.
    4. Valérie Pattyn & Marjolein Bouterse, 2020. "Explaining use and non-use of policy evaluations in a mature evaluation setting," Palgrave Communications, Palgrave Macmillan, vol. 7(1), pages 1-9, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Landiyanto, Erlangga Agustino, 2015. "Transformation of the national monitoring and evaluation arrangement in decentralized Indonesia," MPRA Paper 69073, University Library of Munich, Germany.
    2. Holvoet, Nathalie & Inberg, Liesbeth & Sekirime, Susan, 2013. "Institutional analysis of monitoring and evaluation systems: comparing M&E systems in Uganda's health and education sector," IOB Working Papers 2013.03, Universiteit Antwerpen, Institute of Development Policy (IOB).
    3. Sara Nadel & Lant Pritchett, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Papers 434, Center for Global Development.
    4. Eduardo Levy Yeyati, 2019. "What Works for Active Labor Market Policies?," CID Working Papers 358, Center for International Development at Harvard University.
    5. Nadel, Sara & Pritchett, Lant, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Paper Series rwp16-041, Harvard University, John F. Kennedy School of Government.
    6. Sanou, Aboubakary & Kouyaté, Bocar & Bibeau, Gilles & Nguyen, Vinh-Kim, 2011. "Evaluability Assessment of an immunization improvement strategy in rural Burkina Faso: Intervention theory versus reality, information need and evaluations," Evaluation and Program Planning, Elsevier, vol. 34(3), pages 303-315, August.
    7. Farah Hani & Miguel Angel Santos, 2021. "Diagnosing Human Capital as a Binding Constraint to Growth: Tests, Symptoms and Prescriptions," Growth Lab Working Papers 168, Harvard's Growth Lab.
    8. Hammer, Jeffrey & Spears, Dean, 2013. "Village sanitation and children's human capital : evidence from a randomized experiment by the Maharashtra government," Policy Research Working Paper Series 6580, The World Bank.
    9. Michael Clemens & Gabriel Demombynes, 2013. "The New Transparency in Development Economics: Lessons from the Millennium Villages Controversy," Working Papers 342, Center for Global Development.
    10. Hortigüela Arroyo, María & Ubillos Landa, Silvia, 2019. "Evaluability assessment of a community development leisure program in Spain," Evaluation and Program Planning, Elsevier, vol. 72(C), pages 219-226.
    11. Léna D’Ostie-Racine & Christian Dagenais & Valéry Ridde, 2021. "Examining Conditions that Influence Evaluation use within a Humanitarian Non-Governmental Organization in Burkina Faso (West Africa)," Systemic Practice and Action Research, Springer, vol. 34(1), pages 1-35, February.
    12. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    13. repec:pri:cheawb:tscjeff2013%20paper is not listed on IDEAS
    14. Nathalie Holvoet & Robrecht Renard, 2007. "Monitoring and Evaluation Reform under Changing Aid Modalities: Seeking the Middle Ground in Aid-Dependent Low-Income Countries," WIDER Working Paper Series RP2007-52, World Institute for Development Economic Research (UNU-WIDER).
    15. Clive Bell & Lyn Squire, 2017. "Providing Policy Makers with Timely Advice: The Timeliness-Rigor Trade-off," The World Bank Economic Review, World Bank, vol. 31(2), pages 553-569.
    16. Arkedis, Jean & Creighton, Jessica & Dixit, Akshay & Fung, Archon & Kosack, Stephen & Levy, Dan & Tolmie, Courtney, 2021. "Can transparency and accountability programs improve health? Experimental evidence from Indonesia and Tanzania," World Development, Elsevier, vol. 142(C).
    17. Michael Woolcock, 2013. "Using Case Studies to Explore the External Validity of ‘Complex’ Development Interventions," CID Working Papers 270, Center for International Development at Harvard University.
    18. Cameron, Lisa & Olivia, Susan & Shah, Manisha, 2019. "Scaling up sanitation: Evidence from an RCT in Indonesia," Journal of Development Economics, Elsevier, vol. 138(C), pages 1-16.
    19. Cornick, Jorge & Trejos, Alberto, 2016. "Building Public Capabilities for Productive Development Policies: Costa Rican Case Studies," IDB Publications (Working Papers) 8017, Inter-American Development Bank.
    20. Woolcock, Michael, 2013. "Using Case Studies to Explore the External Validity of 'Complex' Development Interventions," Working Paper Series rwp13-048, Harvard University, John F. Kennedy School of Government.
    21. Matthijs Janssen, 2016. "What bangs for your bucks? Assessing the design and impact of transformative policy," Innovation Studies Utrecht (ISU) working paper series 16-05, Utrecht University, Department of Innovation Studies, revised Dec 2016.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:67:y:2018:i:c:p:189-199. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.