
Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute

Author

Listed:
  • Bamberger, Michael
  • Tarsilla, Michele
  • Hesse-Biber, Sharlene

Abstract

Many widely-used impact evaluation designs, including randomized controlled trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This is surprising, as experienced planners and evaluators are well aware that unintended consequences occur frequently. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based, or narrative) that programs have achieved their intended objectives, and the logic of many designs, even those considered the most “rigorous,” does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs because many consider them the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an “RCT+” design) can strengthen the credibility of these designs and capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies nine ways in which UCs can occur, and we apply this framework to two of the case studies.
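
The abstract's central claim, that a design testing only pre-specified outcomes cannot by construction detect unintended ones, can be illustrated with a toy simulation. The sketch below is not drawn from the paper: the variable names, effect sizes, and the simple difference-in-means estimator are all hypothetical choices made for illustration.

    # Toy simulation (hypothetical, not from the paper): an RCT analysis that
    # only examines the pre-specified outcome misses an unintended harm.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 2_000
    treated = rng.integers(0, 2, size=n)  # random assignment to treatment/control

    # Intended outcome (e.g., household income): a genuine positive effect.
    income = 10.0 + 2.0 * treated + rng.normal(0.0, 3.0, size=n)

    # Unintended consequence (e.g., intra-household conflict): a real adverse
    # effect that the evaluation never measures because it was not in the
    # program's results framework.
    conflict = 1.0 + 1.5 * treated + rng.normal(0.0, 2.0, size=n)

    def diff_in_means(y, t):
        """Treatment-minus-control difference in means."""
        return y[t == 1].mean() - y[t == 0].mean()

    # The pre-registered analysis touches only `income`, so the report shows
    # a clean success story.
    print(f"Intended outcome effect:    {diff_in_means(income, treated):+.2f}")

    # The adverse effect exists in the data-generating world but is invisible
    # to the design; a mixed-methods ("RCT+") step such as open-ended
    # interviews is what would prompt measuring `conflict` at all.
    print(f"Unmeasured adverse effect:  {diff_in_means(conflict, treated):+.2f}")

Running the sketch reports a positive effect on the measured outcome and a comparable adverse effect on the unmeasured one; only the first would ever appear in the evaluation report.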

Suggested Citation

  • Bamberger, Michael & Tarsilla, Michele & Hesse-Biber, Sharlene, 2016. "Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 155-162.
  • Handle: RePEc:eee:epplan:v:55:y:2016:i:c:p:155-162
    DOI: 10.1016/j.evalprogplan.2016.01.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718916000021
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2016.01.001?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Abhijit Vinayak Banerjee & Alice H. Amsden & Robert H. Bates & Jagdish Bhagwati & Angus Deaton & Nicholas Stern, 2007. "Making Aid Work," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262026155, April.
    2. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    3. Julian C. Jamison & Dean Karlan & Pia Raffler, 2013. "Mixed Method Evaluation of a Passive mHealth Sexual Information Texting Service in Uganda," NBER Working Papers 19107, National Bureau of Economic Research, Inc.
    4. Michael Bamberger, 2015. "Innovations in the use of mixed methods in real-world evaluation," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 7(3), pages 317-326, September.
    5. Christopher B. Barrett & Michael R. Carter, 2010. "The Power and Pitfalls of Experiments in Development Economics: Some Non-random Reflections," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 32(4), pages 515-548.
    6. Ravallion, Martin, 2009. "Should the Randomistas Rule?," The Economists' Voice, De Gruyter, vol. 6(2), pages 1-5, February.
    7. Basu, Kaushik, 2013. "The method of randomization and the role of reasoned intuition," Policy Research Working Paper Series 6722, The World Bank.
    8. Paul J. Gertler & Sebastian Martinez & Patrick Premand & Laura B. Rawlings & Christel M. J. Vermeersch, 2011. "Impact Evaluation in Practice, First Edition [La evaluación de impacto en la práctica]," World Bank Publications, The World Bank, number 2550, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. de Alteriis, Martin, 2020. "What can we learn about unintended consequences from a textual analysis of monitoring reports and evaluations for U.S. foreign assistance programs?," Evaluation and Program Planning, Elsevier, vol. 79(C).
    2. Koch, Dirk-Jan & Schulpen, Lau, 2018. "Introduction to the special issue ‘unintended effects of international cooperation’," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 202-209.
    3. Davidson, Angus Alexander & Young, Michael Denis & Leake, John Espie & O’Connor, Patrick, 2022. "Aid and forgetting the enemy: A systematic review of the unintended consequences of international development in fragile and conflict-affected situations," Evaluation and Program Planning, Elsevier, vol. 92(C).
    4. Peterson, Christina & Skolits, Gary, 2019. "Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory," Evaluation and Program Planning, Elsevier, vol. 76(C), pages 1-1.
    5. Schuster, Roseanne C. & Brewis, Alexandra & Wutich, Amber & Safi, Christelle & Vanrespaille, Teresa Elegido & Bowen, Gina & SturtzSreetharan, Cindi & McDaniel, Anne & Ochandarena, Peggy, 2023. "Individual interviews versus focus groups for evaluations of international development programs: Systematic testing of method performance to elicit sensitive information in a justice study in Haiti," Evaluation and Program Planning, Elsevier, vol. 97(C).
    6. Smith, Jonathan D., 2017. "Positioning Missionaries in Development Studies, Policy, and Practice," World Development, Elsevier, vol. 90(C), pages 63-76.
    7. Morell, Jonathan A., 2018. "Systematic iteration between model and methodology: A proposed approach to evaluating unintended consequences," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 243-252.
    8. Federica Cisilino & Alessandro Monteleone, 2020. "Designing Rural Policies for Sustainable Innovations through a Participatory Approach," Sustainability, MDPI, vol. 12(21), pages 1-17, November.
    9. Pelucha, Martin & Kveton, Viktor & Potluka, Oto, 2019. "Using mixed method approach in measuring effects of training in firms: Case study of the European Social Fund support," Evaluation and Program Planning, Elsevier, vol. 73(C), pages 146-155.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sophie Webber, 2015. "Randomising Development: Geography, Economics and the Search for Scientific Rigour," Tijdschrift voor Economische en Sociale Geografie, Royal Dutch Geographical Society KNAG, vol. 106(1), pages 36-52, February.
    2. Cornwall, Andrea & Aghajanian, Alia, 2017. "How to Find out What’s Really Going On: Understanding Impact through Participatory Process Evaluation," World Development, Elsevier, vol. 99(C), pages 173-185.
    3. Gisselquist, Rachel & Niño-Zarazúa, Miguel, 2013. "What can experiments tell us about how to improve governance?," MPRA Paper 49300, University Library of Munich, Germany.
    4. Olofsgård, Anders, 2012. "The Politics of Aid Effectiveness: Why Better Tools can Make for Worse Outcomes," SITE Working Paper Series 16, Stockholm School of Economics, Stockholm Institute of Transition Economics.
    5. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    6. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    7. Meier zu Selhausen, Felix, 2016. "Women's empowerment in Uganda: colonial roots and contemporary efforts, 1894-2012," Economics PhD Theses 0715, Department of Economics, University of Sussex Business School.
    8. F. Meier zu Selhausen & E. Stam, 2013. "Husbands and Wives. The powers and perils of participation in a microfinance cooperative for female entrepreneurs," Working Papers 13-10, Utrecht School of Economics.
    9. Smith, Lisa C. & Khan, Faheem & Frankenberger, Timothy R. & Wadud, A.K.M. Abdul, 2013. "Admissible Evidence in the Court of Development Evaluation? The Impact of CARE’s SHOUHARDO Project on Child Stunting in Bangladesh," World Development, Elsevier, vol. 41(C), pages 196-216.
    10. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2015. "The gold standard for randomized evaluations: from discussion of method to political economy," Working Papers DT/2015/01, DIAL (Développement, Institutions et Mondialisation).
    11. Ricker-Gilbert, Jacob & Jones, Michael, 2015. "Does storage technology affect adoption of improved maize varieties in Africa? Insights from Malawi’s input subsidy program," Food Policy, Elsevier, vol. 50(C), pages 92-105.
    12. Tilman Brück & Neil T. N. Ferguson, 2020. "Money can’t buy love but can it buy peace? Evidence from the EU Programme for Peace and Reconciliation (PEACE II)," Conflict Management and Peace Science, Peace Science Society (International), vol. 37(5), pages 536-558, September.
    13. Florent Bedecarrats & Isabelle Guérin & François Roubaud, 2017. "L'étalon-or des évaluations randomisées : du discours de la méthode à l'économie politique," Working Papers ird-01445209, HAL.
    14. Sara Nadel & Lant Pritchett, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Papers 434, Center for Global Development.
    15. Maredia, Mywish K., 2009. "Improving the proof: Evolution of and emerging trends in impact assessment methods and approaches in agricultural development," IFPRI discussion papers 929, International Food Policy Research Institute (IFPRI).
    16. Gwenolé Le Velly & Céline Dutilly, 2016. "Evaluating Payments for Environmental Services: Methodological Challenges," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-20, February.
    17. Beliyou Haile & Carlo Azzarri & Cleo Roberts & David J. Spielman, 2017. "Targeting, bias, and expected impact of complex innovations on developing-country agriculture: evidence from Malawi," Agricultural Economics, International Association of Agricultural Economists, vol. 48(3), pages 317-326, May.
    18. Doss, Cheryl, 2013. "Intrahousehold bargaining and resource allocation in developing countries," Policy Research Working Paper Series 6337, The World Bank.
    19. David McKenzie, 2010. "Impact Assessments in Finance and Private Sector Development: What Have We Learned and What Should We Learn?," The World Bank Research Observer, World Bank, vol. 25(2), pages 209-233, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:55:y:2016:i:c:p:155-162. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.