IDEAS home Printed from https://ideas.repec.org/p/foi/wpaper/2011_16.html

A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries

Author

Listed:
  • Henrik Hansen

    (Institute of Food and Resource Economics, University of Copenhagen)

  • Ninja Ritter Klejnstrup

    (Evaluation Department, Ministry of Foreign Affairs of Denmark, Danida)

  • Ole Winckler Andersen

    (Evaluation Department, Ministry of Foreign Affairs of Denmark, Danida)

Abstract

We argue that non-experimental impact estimators will continue to be needed for evaluations of interventions in developing countries, as social experiments will, for various reasons, never be the most preferred approach. Surveying four studies that empirically compare the performance of experimental and non-experimental impact estimates using data from development interventions, we show that the preferred non-experimental estimators are unbiased. We seek to explain why non-experimental estimators perform better in the context of development interventions than in that of American job-market interventions. We also draw on the survey for suggestions on the implementation and assessment of non-experimental impact evaluations. Our main suggestion is to be more careful and precise in formulating the statistical model for assignment to the program, and to use the assignment information for model-based systematic sampling.
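The paper's main suggestion, modelling assignment to the program carefully, is the idea behind propensity-score matching, the non-experimental estimator examined in several of the studies surveyed (e.g. Dehejia and Wahba; Smith and Todd). The sketch below is not from the paper; it is a minimal, self-contained illustration on simulated data, with all variable names and parameter values invented for the example: a logistic assignment model is estimated by gradient ascent, and each treated unit is then matched to the control with the nearest estimated propensity score to recover the effect of treatment on the treated.

```python
import math
import random

random.seed(0)

# Simulated intervention: assignment to the program (d) depends on an
# observed covariate x (say, a poverty score), and the true effect of
# the program on the outcome y is 2.0.
n = 2000
data = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    p_true = 1.0 / (1.0 + math.exp(-0.5 * x))   # true assignment model
    d = 1 if random.random() < p_true else 0
    y = x + 2.0 * d + random.gauss(0.0, 1.0)
    data.append((x, d, y))

treated = [(x, y) for x, d, y in data if d == 1]
controls = [(x, y) for x, d, y in data if d == 0]

# The naive difference in means is biased upwards, because x raises
# both the chance of assignment and the outcome.
naive = (sum(y for _, y in treated) / len(treated)
         - sum(y for _, y in controls) / len(controls))

# Step 1: estimate the assignment model, a logistic regression of d on
# x, by gradient ascent on the average log-likelihood.
a = b = 0.0
for _ in range(2000):
    grad_a = grad_b = 0.0
    for x, d, _ in data:
        p = 1.0 / (1.0 + math.exp(-(a + b * x)))
        grad_a += d - p
        grad_b += (d - p) * x
    a += 0.05 * grad_a / n
    b += 0.05 * grad_b / n

def pscore(x):
    return 1.0 / (1.0 + math.exp(-(a + b * x)))

# Step 2: match each treated unit to the control with the nearest
# estimated propensity score (with replacement) and average the
# outcome differences: the effect of treatment on the treated.
control_ps = [(pscore(x), y) for x, y in controls]
diffs = []
for x, y in treated:
    ps = pscore(x)
    _, y0 = min(control_ps, key=lambda c: abs(c[0] - ps))
    diffs.append(y - y0)
att = sum(diffs) / len(diffs)
```

Matching on the estimated score removes most of the bias the assignment process induces in the naive comparison. In a real evaluation the assignment model would include every covariate believed to drive program placement, which is precisely the care in model formulation the abstract calls for.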

Suggested Citation

  • Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
  • Handle: RePEc:foi:wpaper:2011_16

    Download full text from publisher

    File URL: http://okonomi.foi.dk/workingpapers/WPpdf/WP2011/WP_2011_16_model_vs_design.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. James J. Heckman & Edward J. Vytlacil, 2007. "Econometric Evaluation of Social Programs, Part II: Using the Marginal Treatment Effect to Organize Alternative Econometric Estimators to Evaluate Social Programs, and to Forecast their Effects in New Environments," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 71, Elsevier.
    3. Andrew E. Clark, 2003. "Unemployment as a Social Norm: Psychological Evidence from Panel Data," Journal of Labor Economics, University of Chicago Press, vol. 21(2), pages 289-322, April.
    4. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 63-93, September.
    5. Alois Stutzer & Rafael Lalive, 2004. "The Role of Social Work Norms in Job Searching and Subjective Well-Being," Journal of the European Economic Association, MIT Press, vol. 2(4), pages 696-719, June.
    6. Hausman, Jerry, 2015. "Specification tests in econometrics," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 38(2), pages 112-134.
    7. Dehejia, Rajeev, 2005. "Practical propensity score matching: a reply to Smith and Todd," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 355-364.
    8. David McKenzie & John Gibson & Steven Stillman, 2010. "How Important Is Selection? Experimental vs. Non-Experimental Measures of the Income Gains from Migration," Journal of the European Economic Association, MIT Press, vol. 8(4), pages 913-945, June.
    9. Glewwe, Paul & Kremer, Michael & Moulin, Sylvie & Zitzewitz, Eric, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    10. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    11. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    12. Todd, Petra E., 2008. "Evaluating Social Programs with Endogenous Program Placement and Selection of the Treated," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 60, pages 3847-3894, Elsevier.
    13. Christopher B. Barrett & Michael R. Carter, 2010. "The Power and Pitfalls of Experiments in Development Economics: Some Non-random Reflections," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 32(4), pages 515-548.
    14. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. Elster, Jon, 1989. "Social Norms and Economic Theory," Journal of Economic Perspectives, American Economic Association, vol. 3(4), pages 99-117, Fall.
    17. Thomas D. Cook & William R. Shadish & Vivian C. Wong, 2008. "Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(4), pages 724-750.
    18. David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2006. "Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(3), pages 523-552.
    19. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    20. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico’s PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).
    21. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    22. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    23. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    24. Rajeev H. Dehejia & Sadek Wahba, 1998. "Causal Effects in Non-Experimental Studies: Re-Evaluating the Evaluation of Training Programs," NBER Working Papers 6586, National Bureau of Economic Research, Inc.
    25. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    26. James J. Heckman & Edward J. Vytlacil, 2007. "Econometric Evaluation of Social Programs, Part I: Causal Models, Structural Models and Econometric Policy Evaluation," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 70, Elsevier.
    27. repec:mpr:mprres:3694 is not listed on IDEAS
    28. Buddelmeyer, Hielke & Skoufias, Emmanuel, 2003. "An Evaluation of the Performance of Regression Discontinuity Design on PROGRESA," IZA Discussion Papers 827, Institute of Labor Economics (IZA).
    29. Heckman, James J. & Robb, Richard Jr., 1985. "Alternative methods for evaluating the impact of interventions : An overview," Journal of Econometrics, Elsevier, vol. 30(1-2), pages 239-267.
    30. Paul J. Gertler & Sebastian Martinez & Patrick Premand & Laura B. Rawlings & Christel M. J. Vermeersch, . "Impact Evaluation in Practice, First Edition [La evaluación de impacto en la práctica]," World Bank Publications, The World Bank, number 2550, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    2. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    3. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    4. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    5. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    6. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    7. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    8. Hugh Sharma Waddington & Paul Fenton Villar & Jeffrey C. Valentine, 2023. "Can Non-Randomised Studies of Interventions Provide Unbiased Effect Estimates? A Systematic Review of Internal Replication Studies," Evaluation Review, vol. 47(3), pages 563-593, June.
    9. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    10. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    11. Sloczynski, Tymon, 2018. "A General Weighted Average Representation of the Ordinary and Two-Stage Least Squares Estimands," IZA Discussion Papers 11866, Institute of Labor Economics (IZA).
    12. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    13. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    14. Tymon Słoczyński, 2018. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," Papers 1810.01576, arXiv.org, revised May 2020.
    15. Jones, A.M. & Rice, N., 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    16. Zhao, Zhong, 2008. "Sensitivity of propensity score methods to the specifications," Economics Letters, Elsevier, vol. 98(3), pages 309-319, March.
    17. Yonatan Eyal, 2020. "Self-Assessment Variables as a Source of Information in the Evaluation of Intervention Programs: A Theoretical and Methodological Framework," SAGE Open, vol. 10(1), January.
    18. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    19. James J. Heckman & Petra E. Todd, 2009. "A note on adapting propensity score matching and selection models to choice based samples," Econometrics Journal, Royal Economic Society, vol. 12(s1), pages 230-234, January.
    20. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-10, University of Miami, Department of Economics.

    More about this item

    Keywords

    Development; impact; non-experimental; social experiment; within-study.

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • H43 - Public Economics - - Publicly Provided Goods - - - Project Evaluation; Social Discount Rate
    • O22 - Economic Development, Innovation, Technological Change, and Growth - - Development Planning and Policy - - - Project Analysis


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:foi:wpaper:2011_16. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Geir Tveit (email available below). General contact details of provider: https://edirc.repec.org/data/foikudk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.