Printed from https://ideas.repec.org/p/cid/wpfacu/270.html

Using Case Studies to Explore the External Validity of ‘Complex’ Development Interventions

Author

Listed:
  • Michael Woolcock

    (Center for International Development at Harvard University)

Abstract

Rising standards for accurately inferring the impact of development projects have not been matched by equivalently rigorous procedures for guiding decisions about whether and how similar results might be expected elsewhere. These 'external validity' concerns are especially pressing for 'complex' development interventions, in which the explicit purpose is often to adapt projects to local contextual realities and where high-quality implementation is paramount to success. This paper provides a basic analytical framework for assessing the external validity of complex development interventions. It argues for deploying case studies to better identify the conditions under which diverse outcomes are observed, focusing in particular on the salience of contextual idiosyncrasies, implementation capabilities and trajectories of change. Upholding the canonical methodological principle that questions should guide methods, not vice versa, is required if a truly rigorous basis for generalizing claims about likely impact across time, groups, contexts and scales of operation is to be discerned for different kinds of development interventions.

Suggested Citation

  • Michael Woolcock, 2013. "Using Case Studies to Explore the External Validity of ‘Complex’ Development Interventions," CID Working Papers 270, Center for International Development at Harvard University.
  • Handle: RePEc:cid:wpfacu:270

    Download full text from publisher

    File URL: https://www.hks.harvard.edu/sites/default/files/centers/cid/files/publications/faculty-working-papers/270_Woolcock.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. Abhijit V. Banerjee & Shawn Cole & Esther Duflo & Leigh Linden, 2007. "Remedying Education: Evidence from Two Randomized Experiments in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(3), pages 1235-1264.
    3. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    4. Michael Woolcock & Simon Szreter & Vijayendra Rao, 2011. "How and Why Does History Matter for Development Policy?," Journal of Development Studies, Taylor & Francis Journals, vol. 47(1), pages 70-96.
    5. James Mahoney, 2000. "Strategies of Causal Inference in Small-N Analysis," Sociological Methods & Research, vol. 28(4), pages 387-424, May.
    6. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    7. Duflo, Esther & Dupas, Pascaline & Kremer, Michael, 2015. "School governance, teacher incentives, and pupil–teacher ratios: Experimental evidence from Kenyan primary schools," Journal of Public Economics, Elsevier, vol. 123(C), pages 92-110.
    8. Ravallion, Martin, 2001. "Growth, Inequality and Poverty: Looking Beyond Averages," World Development, Elsevier, vol. 29(11), pages 1803-1815, November.
    9. Pritchett, Lant & Woolcock, Michael, 2004. "Solutions When the Solution is the Problem: Arraying the Disarray in Development," World Development, Elsevier, vol. 32(2), pages 191-212, February.
    10. Paul Shaffer, 2011. "Against Excessive Rhetoric in Impact Assessment: Overstating the Case for Randomised Controlled Experiments," Journal of Development Studies, Taylor & Francis Journals, vol. 47(11), pages 1619-1635.
    11. repec:unu:wpaper:wp2012-64 is not listed on IDEAS
    12. Ravallion Martin, 2009. "Should the Randomistas Rule?," The Economists' Voice, De Gruyter, vol. 6(2), pages 1-5, February.
    13. Lant Pritchett & Salimah Samji & Jeffrey S. Hammer, 2012. "It's All about MeE: Using Structured Experiential Learning ('e') to Crawl the Design Space," WIDER Working Paper Series wp-2012-104, World Institute for Development Economic Research (UNU-WIDER).
    14. Ariel Fiszbein & Norbert Schady & Francisco H.G. Ferreira & Margaret Grosh & Niall Keleher & Pedro Olinto & Emmanuel Skoufias, 2009. "Conditional Cash Transfers: Reducing Present and Future Poverty," World Bank Publications - Books, The World Bank Group, number 2597.
    15. Bruhn, Miriam & McKenzie, David, 2013. "Using administrative data to evaluate municipal reforms: an evaluation of the impact of Minas Facil Expresso," Policy Research Working Paper Series 6368, The World Bank.
    16. Lant Pritchett & Salimah Samji & Jeffrey Hammer, 2012. "It’s All About MeE: Using Structured Experiential Learning (‘e’) to Crawl the Design Space," CID Working Papers 249, Center for International Development at Harvard University.
    17. Tessa Bold & Mwangi Kimenyi & Germano Mwabu & Alice Ng'ang'a & Justin Sandefur, 2013. "Scaling-up What Works: Experimental Evidence on External Validity in Kenyan Education," CSAE Working Paper Series 2013-04, Centre for the Study of African Economies, University of Oxford.
    18. Lant Pritchett & Michael Woolcock & Matt Andrews, 2013. "Looking Like a State: Techniques of Persistent Failure in State Capability for Implementation," Journal of Development Studies, Taylor & Francis Journals, vol. 49(1), pages 1-18, January.
    19. repec:pri:rpdevs:hammer_its_all_about_me is not listed on IDEAS
    20. Miriam Bruhn & David McKenzie, 2013. "Using administrative data to evaluate municipal reforms: an evaluation of the impact of Minas Fácil Expresso," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 5(3), pages 319-338, September.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Woolcock, Michael, 2013. "Using Case Studies to Explore the External Validity of 'Complex' Development Interventions," Working Paper Series rwp13-048, Harvard University, John F. Kennedy School of Government.
    2. Lant Pritchett & Justin Sandefur, 2013. "Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix," Working Papers 336, Center for Global Development.
    3. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    4. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    5. Florent BEDECARRATS & Isabelle GUERIN & François ROUBAUD, 2017. "L'étalon-or des évaluations randomisées : économie politique des expérimentations aléatoires dans le domaine du développement [The gold standard of randomized evaluations: the political economy of randomized experiments in development]," Working Paper 753120cd-506f-4c5f-80ed-7, Agence française de développement.
    6. Mo, Di & Bai, Yu & Shi, Yaojiang & Abbey, Cody & Zhang, Linxiu & Rozelle, Scott & Loyalka, Prashant, 2020. "Institutions, implementation, and program effectiveness: Evidence from a randomized evaluation of computer-assisted learning in rural China," Journal of Development Economics, Elsevier, vol. 146(C).
    7. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    8. Annie Duflo & Jessica Kiessel & Adrienne Lucas, 2020. "Experimental Evidence on Alternative Policies to Increase Learning at Scale," NBER Working Papers 27298, National Bureau of Economic Research, Inc.
    9. Baylis, Kathy & Ham, Andres, 2015. "How important is spatial correlation in randomized controlled trials?," 2015 AAEA & WAEA Joint Annual Meeting, July 26-28, San Francisco, California 205586, Agricultural and Applied Economics Association.
    10. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    11. Karthik Muralidharan & Venkatesh Sundararaman, 2013. "Contract Teachers: Experimental Evidence from India," NBER Working Papers 19440, National Bureau of Economic Research, Inc.
    12. Michael Clemens & Gabriel Demombynes, 2013. "The New Transparency in Development Economics: Lessons from the Millennium Villages Controversy," Working Papers 342, Center for Global Development.
    13. Sedlmayr, Richard & Shah, Anuj & Sulaiman, Munshi, 2020. "Cash-plus: Poverty impacts of alternative transfer-based approaches," Journal of Development Economics, Elsevier, vol. 144(C).
    14. Lant Pritchett & Justin Sandefur, 2015. "Learning from Experiments When Context Matters," American Economic Review, American Economic Association, vol. 105(5), pages 471-475, May.
    15. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2017. "Building State Capability: Evidence, Analysis, Action," OUP Catalogue, Oxford University Press, number 9780198747482.
    16. Cameron, Lisa & Olivia, Susan & Shah, Manisha, 2019. "Scaling up sanitation: Evidence from an RCT in Indonesia," Journal of Development Economics, Elsevier, vol. 138(C), pages 1-16.
    17. Gentilini, Ugo & Omamo, Steven Were, 2011. "Social protection 2.0: Exploring issues, evidence and debates in a globalizing world," Food Policy, Elsevier, vol. 36(3), pages 329-340, June.
    18. Eva Vivalt, 2015. "Heterogeneous Treatment Effects in Impact Evaluation," American Economic Review, American Economic Association, vol. 105(5), pages 467-470, May.
    19. Benjamin A. Olken, 2020. "Banerjee, Duflo, Kremer, and the Rise of Modern Development Economics," Scandinavian Journal of Economics, Wiley Blackwell, vol. 122(3), pages 853-878, July.

    More about this item

    Keywords

    Case Studies; External Validity; Complexity; Evaluation

    JEL classification:

    • O1 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development
    • B40 - Schools of Economic Thought and Methodology - - Economic Methodology - - - General



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.