Printed from https://ideas.repec.org/a/bpj/globdv/v6y2015i1p47-69n1.html

Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic

Author

Listed:
  • Dehejia, Rajeev

    (New York University, CESifo, IZA, and NBER, NYU Wagner, 295 Lafayette Street, 2nd floor, New York, NY 10012)

Abstract

This paper surveys six widely used non-experimental methods for estimating treatment effects (instrumental variables, regression discontinuity, direct matching, propensity score matching, linear regression and non-parametric methods, and difference-in-differences) and assesses their internal and external validity relative both to each other and to randomized controlled trials. While randomized controlled trials can achieve the highest degree of internal validity when cleanly implemented in the field, the availability of large, nationally representative data sets offers the opportunity for a high degree of external validity using non-experimental methods. We argue that each method has merit in some contexts and that the methods are complements rather than substitutes.
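
As an illustration of the last of these designs, the short sketch below computes a difference-in-differences estimate on simulated data. It is not taken from the article; the variable names, simulated sample, and effect size are hypothetical, and the code is only meant to make the estimator concrete.

    # Minimal difference-in-differences sketch on simulated data (illustration only,
    # not from the article; the effect size and variable names are hypothetical).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    treated = rng.integers(0, 2, n)   # treatment-group indicator
    post = rng.integers(0, 2, n)      # post-period indicator
    true_effect = 1.5                 # assumed treatment effect, used only to simulate
    y = (0.5 * treated + 0.8 * post + true_effect * treated * post
         + rng.normal(0.0, 1.0, n))   # outcome = group effect + time effect + treatment + noise

    # DiD estimate: the over-time change for the treated group minus the change for controls
    did = ((y[(treated == 1) & (post == 1)].mean() - y[(treated == 1) & (post == 0)].mean())
           - (y[(treated == 0) & (post == 1)].mean() - y[(treated == 0) & (post == 0)].mean()))
    print(f"difference-in-differences estimate: {did:.2f} (simulated effect: {true_effect})")

With a reasonably large simulated sample the printed estimate should sit close to the assumed effect of 1.5; the sketch only fixes ideas about one of the surveyed estimators and does not reproduce any analysis in the paper.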

Suggested Citation

  • Dehejia, Rajeev, 2015. "Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 47-69, June.
  • Handle: RePEc:bpj:globdv:v:6:y:2015:i:1:p:47-69:n:1
    DOI: 10.1515/jgd-2014-0005

    Download full text from publisher

    File URL: https://doi.org/10.1515/jgd-2014-0005
    Download Restriction: For access to full text, subscription to the journal or payment for the individual article is required.

    File URL: https://libkey.io/10.1515/jgd-2014-0005?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    2. James J. Heckman, 2001. "Micro Data, Heterogeneity, and the Evaluation of Public Policy: Nobel Lecture," Journal of Political Economy, University of Chicago Press, vol. 109(4), pages 673-748, August.
    3. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Card, David & Krueger, Alan B, 1994. "Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania," American Economic Review, American Economic Association, vol. 84(4), pages 772-793, September.
    5. Jere R. Behrman & Yingmei Cheng & Petra E. Todd, 2004. "Evaluating Preschool Programs When Length of Exposure to the Program Varies: A Nonparametric Approach," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 108-132, February.
    6. Esther Duflo, 2001. "Schooling and Labor Market Consequences of School Construction in Indonesia: Evidence from an Unusual Policy Experiment," American Economic Review, American Economic Association, vol. 91(4), pages 795-813, September.
    7. Imbens, Guido W. & Lemieux, Thomas, 2008. "Regression discontinuity designs: A guide to practice," Journal of Econometrics, Elsevier, vol. 142(2), pages 615-635, February.
    8. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
    9. Subramanian, Shankar & Deaton, Angus, 1996. "The Demand for Food and Calories," Journal of Political Economy, University of Chicago Press, vol. 104(1), pages 133-162, February.
    10. Joshua Angrist & Eric Bettinger & Erik Bloom & Elizabeth King & Michael Kremer, 2002. "Vouchers for Private Schooling in Colombia: Evidence from a Randomized Natural Experiment," American Economic Review, American Economic Association, vol. 92(5), pages 1535-1558, December.
    11. Markus Frölich, 2004. "Finite-Sample Properties of Propensity-Score Matching and Weighting Estimators," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 77-90, February.
    12. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    13. McCrary, Justin, 2008. "Manipulation of the running variable in the regression discontinuity design: A density test," Journal of Econometrics, Elsevier, vol. 142(2), pages 698-714, February.
    14. Wilbert Van Der Klaauw, 2008. "Regression–Discontinuity Analysis: A Survey of Recent Developments in Economics," LABOUR, CEIS, vol. 22(2), pages 219-245, June.
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. Keisuke Hirano & Guido W. Imbens & Geert Ridder, 2003. "Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score," Econometrica, Econometric Society, vol. 71(4), pages 1161-1189, July.
    17. Jeffrey M. Wooldridge, 2010. "Econometric Analysis of Cross Section and Panel Data," MIT Press Books, The MIT Press, edition 2, volume 1, number 0262232588, April.
    18. Tseday Jemaneh Mekasha & Finn Tarp, 2013. "Aid and Growth: What Meta-Analysis Reveals," Journal of Development Studies, Taylor & Francis Journals, vol. 49(4), pages 564-583, April.
    19. Busso, Matias & DiNardo, John & McCrary, Justin, 2009. "New Evidence on the Finite Sample Properties of Propensity Score Matching and Reweighting Estimators," IZA Discussion Papers 3998, Institute of Labor Economics (IZA).
    20. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    21. Sylvie Moulin & Michael Kremer & Paul Glewwe, 2009. "Many Children Left Behind? Textbooks and Test Scores in Kenya," American Economic Journal: Applied Economics, American Economic Association, vol. 1(1), pages 112-135, January.
    22. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    23. Shaikh, Azeem M. & Simonsen, Marianne & Vytlacil, Edward J. & Yildiz, Nese, 2009. "A specification test for the propensity score using its distribution conditional on participation," Journal of Econometrics, Elsevier, vol. 151(1), pages 33-46, July.
    24. Jalan, Jyotsna & Ravallion, Martin, 2003. "Estimating the Benefit Incidence of an Antipoverty Program by Propensity-Score Matching," Journal of Business & Economic Statistics, American Statistical Association, vol. 21(1), pages 19-30, January.
    25. Godtland, Erin M & Sadoulet, Elisabeth & De Janvry, Alain & Murgai, Rinku & Ortiz, Oscar, 2004. "The Impact of Farmer Field Schools on Knowledge and Productivity: A Study of Potato Farmers in the Peruvian Andes," Economic Development and Cultural Change, University of Chicago Press, vol. 53(1), pages 63-92, October.
    26. Martin Huber, 2011. "Testing for covariate balance using quantile regression and resampling methods," Journal of Applied Statistics, Taylor & Francis Journals, vol. 38(12), pages 2881-2899, February.
    27. Dehejia, Rajeev H, 2003. "Was There a Riverside Miracle? A Hierarchical Framework for Evaluating Programs with Grouped Data," Journal of Business & Economic Statistics, American Statistical Association, vol. 21(1), pages 1-11, January.
    28. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    29. Brunner, Karl & Meltzer, Allan H., 1976. "The Phillips curve," Carnegie-Rochester Conference Series on Public Policy, Elsevier, vol. 1(1), pages 1-18, January.
    30. Martin Ravallion, 2009. "Evaluation in the Practice of Development," The World Bank Research Observer, World Bank, vol. 24(1), pages 29-53, March.
    31. Jinyong Hahn, 1998. "On the Role of the Propensity Score in Efficient Semiparametric Estimation of Average Treatment Effects," Econometrica, Econometric Society, vol. 66(2), pages 315-332, March.
    32. Jeffrey I. Steinfeld, 1999. "Book," Journal of Industrial Ecology, Yale University, vol. 3(4), pages 145-147, October.
    33. Wilbert van der Klaauw, 2002. "Estimating the Effect of Financial Aid Offers on College Enrollment: A Regression-Discontinuity Approach," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 43(4), pages 1249-1287, November.
    34. Li, Qi & Racine, Jeffrey S. & Wooldridge, Jeffrey M., 2009. "Efficient Estimation of Average Treatment Effects with Mixed Categorical and Continuous Data," Journal of Business & Economic Statistics, American Statistical Association, vol. 27(2), pages 206-223.
    35. Alberto Abadie & Guido W. Imbens, 2006. "Large Sample Properties of Matching Estimators for Average Treatment Effects," Econometrica, Econometric Society, vol. 74(1), pages 235-267, January.
    36. Barnard, J. & Frangakis, C.E. & Hill, J.L. & Rubin, D.B., 2003. "Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 299-323, January.
    37. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico’s PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).
    38. Deaton, Angus, 1989. "Rice Prices and Income Distribution in Thailand: A Non-parametric Analysis," Economic Journal, Royal Economic Society, vol. 99(395), pages 1-37, Supplement.
    39. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. David Alfaro‐Serrano & Tanay Balantrapu & Ritam Chaurey & Ana Goicoechea & Eric Verhoogen, 2021. "Interventions to promote technology adoption in firms: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 17(4), December.
    2. Gisselquist, Rachel M., 2020. "How the cases you choose affect the answers you get, revisited," World Development, Elsevier, vol. 127(C).
    3. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    4. Koppenberg, Maximilian & Mishra, Ashok K. & Hirsch, Stefan, 2023. "Food aid and violent conflict: A review and Empiricist’s companion," Food Policy, Elsevier, vol. 121(C).
    5. Koppenberg, Maximilian & Mishra, Ashok K. & Hirsch, Stefan, 2023. "Food Aid and Violent Conflict: A Review of Literature," IZA Discussion Papers 16574, Institute of Labor Economics (IZA).
    6. Torm, Nina & Oehme, Marty, 2024. "Social protection and formalization in low- and middle-income countries: A scoping review of the literature," World Development, Elsevier, vol. 181(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rajeev Dehejia, 2013. "The Porous Dialectic: Experimental and Non-Experimental Methods in Development Economics," WIDER Working Paper Series wp-2013-011, World Institute for Development Economic Research (UNU-WIDER).
    2. Dehejia, Rajeev, 2013. "The Porous Dialectic: Experimental and Non-Experimental Methods in Development Economics," WIDER Working Paper Series 011, World Institute for Development Economic Research (UNU-WIDER).
    3. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    4. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-10, University of Miami, Department of Economics.
    5. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    6. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    7. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    8. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2013. "The performance of estimators based on the propensity score," Journal of Econometrics, Elsevier, vol. 175(1), pages 1-21.
    9. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2010. "How to Control for Many Covariates? Reliable Estimators Based on the Propensity Score," IZA Discussion Papers 5268, Institute of Labor Economics (IZA).
    10. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    11. Steven Lehrer & Gregory Kordas, 2013. "Matching using semiparametric propensity scores," Empirical Economics, Springer, vol. 44(1), pages 13-45, February.
    12. Sant’Anna, Pedro H.C. & Song, Xiaojun, 2019. "Specification tests for the propensity score," Journal of Econometrics, Elsevier, vol. 210(2), pages 379-404.
    13. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    14. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    15. Chad D. Meyerhoefer & Muzhe Yang, 2011. "The Relationship between Food Assistance and Health: A Review of the Literature and Empirical Strategies for Identifying Program Effects," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 33(3), pages 304-344.
    16. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    17. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    18. Arun Advani & Toru Kitagawa & Tymon Słoczyński, 2019. "Mostly harmless simulations? Using Monte Carlo studies for estimator selection," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 34(6), pages 893-910, September.
    19. Advani, Arun & Sloczynski, Tymon, 2013. "Mostly Harmless Simulations? On the Internal Validity of Empirical Monte Carlo Studies," IZA Discussion Papers 7874, Institute of Labor Economics (IZA).
    20. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.

    More about this item

    Keywords

    external validity; program evaluation; randomized controlled trials; observational studies

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C52 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Evaluation, Validation, and Selection
    • O12 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development - - - Microeconomic Analyses of Economic Development

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bpj:globdv:v:6:y:2015:i:1:p:47-69:n:1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.degruyter.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.