IDEAS home Printed from https://ideas.repec.org/a/sae/evarev/v47y2023i3p563-593.html

Can Non-Randomised Studies of Interventions Provide Unbiased Effect Estimates? A Systematic Review of Internal Replication Studies

Author

Listed:
  • Hugh Sharma Waddington
  • Paul Fenton Villar
  • Jeffrey C. Valentine

Abstract

Non-randomized studies of intervention effects (NRS), also called quasi-experiments, provide useful decision support about development impacts. However, the assumptions underpinning them are usually untestable, their verification resting on empirical replication. The internal replication study aims to do this by comparing results from a causal benchmark study, usually a randomized controlled trial (RCT), with those from an NRS conducted at the same time in the sampled population. We aimed to determine the credibility and generalizability of findings in internal replication studies in development economics, through a systematic review and meta-analysis. We systematically searched for internal replication studies of RCTs conducted on socioeconomic interventions in low- and middle-income countries. We critically appraised the benchmark randomized studies, using an adapted tool. We extracted and statistically synthesized empirical measures of bias. We included 600 estimates of correspondence between NRS and benchmark RCTs. All internal replication studies were found to have at least “some concerns” about bias and some had high risk of bias. We found that study designs with selection on unobservables, in particular regression discontinuity, on average produced absolute standardized bias estimates that were approximately zero, that is, equivalent to the estimates produced by RCTs. But study conduct also mattered. For example, matching using pre-tests and nearest neighbor algorithms corresponded more closely to the benchmarks. The findings from this systematic review confirm that NRS can produce unbiased estimates. Authors of internal replication studies should publish pre-analysis protocols to enhance their credibility.
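The abstract's central quantity is the discrepancy between each NRS estimate and its RCT benchmark, expressed on a standardized scale. The paper's exact formula is not given here, so the following is only an illustrative sketch: the function name and the choice of a pooled standard deviation as the standardizer are assumptions, not the authors' specification.

```python
# Illustrative sketch (assumed formula): absolute standardized bias
# between a non-randomized study (NRS) estimate and its RCT benchmark.
# The standardizer `pooled_sd` is a hypothetical choice; internal
# replication studies may standardize differently.

def absolute_standardized_bias(nrs_estimate: float,
                               rct_estimate: float,
                               pooled_sd: float) -> float:
    """Absolute difference between the NRS and benchmark RCT effect
    estimates, in standard-deviation units of the outcome."""
    if pooled_sd <= 0:
        raise ValueError("pooled_sd must be positive")
    return abs(nrs_estimate - rct_estimate) / pooled_sd

# Example: an NRS estimate of 0.21 against an RCT benchmark of 0.20,
# with outcome SD 0.5, yields a standardized bias of about 0.02 --
# close to the "approximately zero" pattern the review reports for
# regression discontinuity designs.
bias = absolute_standardized_bias(0.21, 0.20, 0.5)
```

A value near zero, as found on average for designs with selection on unobservables, indicates the NRS reproduced the benchmark estimate.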

Suggested Citation

  • Hugh Sharma Waddington & Paul Fenton Villar & Jeffrey C. Valentine, 2023. "Can Non-Randomised Studies of Interventions Provide Unbiased Effect Estimates? A Systematic Review of Internal Replication Studies," Evaluation Review, , vol. 47(3), pages 563-593, June.
  • Handle: RePEc:sae:evarev:v:47:y:2023:i:3:p:563-593
    DOI: 10.1177/0193841X221116721

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X221116721
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X221116721?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Willa Friedman & Michael Kremer & Edward Miguel & Rebecca Thornton, 2016. "Education as Liberation?," Economica, London School of Economics and Political Science, vol. 83(329), pages 1-30, January.
    2. Galiani, Sebastian & McEwan, Patrick J., 2013. "The heterogeneous impact of conditional cash transfers," Journal of Public Economics, Elsevier, vol. 103(C), pages 85-96.
    3. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, , vol. 589(1), pages 63-93, September.
    4. Matias Busso & John DiNardo & Justin McCrary, 2014. "New Evidence on the Finite Sample Properties of Propensity Score Reweighting and Matching Estimators," The Review of Economics and Statistics, MIT Press, vol. 96(5), pages 885-897, December.
    5. Maluccio, John A. & Flores, Rafael, 2005. "Impact evaluation of a conditional cash transfer program: the Nicaraguan Red de Protección Social," Research reports 141, International Food Policy Research Institute (IFPRI).
    6. Peter Z. Schochet, "undated". "Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations," Mathematica Policy Research Reports 61fb6c057561451a8a6074508, Mathematica Policy Research.
    7. David McKenzie & John Gibson & Steven Stillman, 2010. "How Important Is Selection? Experimental vs. Non-Experimental Measures of the Income Gains from Migration," Journal of the European Economic Association, MIT Press, vol. 8(4), pages 913-945, June.
    8. Glewwe, Paul & Kremer, Michael & Moulin, Sylvie & Zitzewitz, Eric, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    9. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    10. Luis Rubalcava & Graciela Teruel & Duncan Thomas, 2009. "Investments, Time Preferences, and Public Transfers Paid to Women," Economic Development and Cultural Change, University of Chicago Press, vol. 57(3), pages 507-538, April.
    11. Felipe Barrera-Osorio & Deon Filmer, 2016. "Incentivizing Schooling for Learning: Evidence on the Impact of Alternative Targeting Approaches," Journal of Human Resources, University of Wisconsin Press, vol. 51(2), pages 461-499.
    12. Jose Urquieta & Gustavo Angeles & Thomas Mroz & Hector Lamadrid-Figueroa & Bernardo Hernández, 2009. "Impact of Oportunidades on Skilled Attendance at Delivery in Rural Areas," Economic Development and Cultural Change, University of Chicago Press, vol. 57(3), pages 539-558, April.
    13. Angelucci, Manuela & De Giorgi, Giacomo, 2006. "Indirect Effects of an Aid Program: The Case of Progresa and Consumption," IZA Discussion Papers 1955, Institute of Labor Economics (IZA).
    14. Hector Lamadrid-Figueroa & Gustavo Angeles & Thomas Mroz & Jose Urquieta-Salomon & Bernardo Hernandez-Prado & Aurelio Cruz-Valdez & Martha Tellez-Rojo, 2010. "Heterogeneous impact of the social programme Oportunidades on use of contraceptive methods by young adult women living in rural areas," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 2(1), pages 74-86.
15. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico's PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).
    16. repec:mpr:mprres:6372 is not listed on IDEAS
    17. David S. Lee & Thomas Lemieux, 2010. "Regression Discontinuity Designs in Economics," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 281-355, June.
    18. Shayda Mae Sabet & Annette N. Brown, 2018. "Is impact evaluation still on the rise? The new trends in 2010–2015," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 10(3), pages 291-304, July.
    19. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    20. Paul Fenton Villar & Hugh Waddington, 2019. "Within study comparisons and risk of bias in international development: Systematic review and critical appraisal," Campbell Systematic Reviews, John Wiley & Sons, vol. 15(1-2), June.
    21. repec:mpr:mprres:3694 is not listed on IDEAS
    22. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    23. Buddelmeyer, Hielke & Skoufias, Emmanuel, 2003. "An Evaluation of the Performance of Regression Discontinuity Design on PROGRESA," IZA Discussion Papers 827, Institute of Labor Economics (IZA).
    24. Skoufias, Emmanuel & Davis, Benjamin & de la Vega, Sergio, 2001. "Targeting the Poor in Mexico: An Evaluation of the Selection of Households into PROGRESA," World Development, Elsevier, vol. 29(10), pages 1769-1784, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
    2. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    5. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
6. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    7. Michael Clemens & Erwin Tiongson, 2012. "Split Decisions: Family finance when a policy discontinuity allocates overseas work," RF Berlin - CReAM Discussion Paper Series 1234, Rockwool Foundation Berlin (RF Berlin) - Centre for Research and Analysis of Migration (CReAM).
    8. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
9. Melba V. Tutor, 2014. "The impact of the Philippines' conditional cash transfer program on consumption," Philippine Review of Economics, University of the Philippines School of Economics and Philippine Economic Society, vol. 51(1), pages 117-161, June.
    10. Halldén, Karin & Stenberg, Anders, 2013. "The Relationship between Hours of Domestic Services and Female Earnings: Panel Register Data Evidence from a Reform," Working Paper Series 4/2013, Stockholm University, Swedish Institute for Social Research.
    11. Galiani, Sebastian & McEwan, Patrick J., 2013. "The heterogeneous impact of conditional cash transfers," Journal of Public Economics, Elsevier, vol. 103(C), pages 85-96.
    12. Cook, Thomas D., 2008. ""Waiting for Life to Arrive": A history of the regression-discontinuity design in Psychology, Statistics and Economics," Journal of Econometrics, Elsevier, vol. 142(2), pages 636-654, February.
    13. Maureen A. Pirog & Anne L. Buffardi & Colleen K. Chrisinger & Pradeep Singh & John Briney, 2009. "Are the alternatives to randomized assignment nearly as good? Statistical corrections to nonrandomized evaluations," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 28(1), pages 169-172.
    14. Sebastian Galiani & Patrick J. McEwan & Brian Quistorff, 2017. "External and Internal Validity of a Geographic Quasi-Experiment Embedded in a Cluster-Randomized Experiment," Advances in Econometrics, in: Regression Discontinuity Designs, volume 38, pages 195-236, Emerald Group Publishing Limited.
    15. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    16. Chakravarty, Shubha & Lundberg, Mattias & Nikolov, Plamen & Zenker, Juliane, 2019. "Vocational training programs and youth labor market outcomes: Evidence from Nepal," Journal of Development Economics, Elsevier, vol. 136(C), pages 71-110.
    17. Anders Stenberg & Olle Westerlund, 2015. "The long-term earnings consequences of general vs. specific training of the unemployed," IZA Journal of European Labor Studies, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-26, December.
    18. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    19. Alejandro J. Ganimian & Richard J. Murnane, 2014. "Improving Educational Outcomes in Developing Countries: Lessons from Rigorous Impact Evaluations," NBER Working Papers 20284, National Bureau of Economic Research, Inc.
    20. Robert D. Osei & Monica Lambon‐Quayefio, 2021. "Cash transfers and the supply of labor by poor households: Evidence from the livelihood empowerment against poverty program in Ghana," Review of Development Economics, Wiley Blackwell, vol. 25(3), pages 1293-1304, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:47:y:2023:i:3:p:563-593. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.