
Bolivian Social Investment Fund Analysis of Baseline Data for Impact Evaluation

Author

Listed:
  • Menno Pradhan

    (Vrije Universiteit Amsterdam)

  • Laura Rawlings

    (The World Bank)

  • Geert Ridder

    (Vrije Universiteit Amsterdam)

Abstract

This discussion paper resulted in an article in the World Bank Economic Review (1998), volume 12, issue 3, pages 457-483. On the basis of the baseline data collected for the evaluation of the Bolivian Social Investment Fund (SIF), this paper assesses (1) the benefit incidence of the SIF and (2) the quality of the evaluation design. We find that the benefits in education are the most equally distributed across the population, while the investments in health and sanitation favor those who are relatively well off. For the education component of the SIF, control groups of schools that will not receive benefits have been included in the survey. In one region these schools were selected by matched comparison on observed characteristics, in the other region by means of randomization. We compare control and treatment groups and conclude that there is a systematic bias in favor of treatment schools in the first region. We propose to use instrumental variables to control for the non-random selection. With the pre-intervention data we can test whether an instrument is valid. We find that among several candidates the number of NGOs (non-governmental organizations) in the community is a valid instrument. Next, we investigate the possible loss of efficiency in the estimate of the impact due to the non-experimental control group design.
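
The identification strategy described above (instrumenting program placement with the number of NGOs in the community, and testing candidate instruments against pre-intervention outcomes) can be illustrated with a small simulation. The sketch below is not taken from the paper: the data are simulated and all variable names (ngo_count, treated, outcome, pre_outcome) are hypothetical. It only shows the generic just-identified two-stage least squares (2SLS) mechanics and the baseline falsification check.

import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Instrument Z: number of NGOs active in the community (hypothetical variable).
ngo_count = rng.poisson(2.0, size=n).astype(float)
# Unobserved community characteristic driving both selection and outcomes.
ability = rng.normal(size=n)
# Treatment (SIF investment) depends on the instrument and on the unobservable,
# so a naive OLS comparison of treated and untreated schools is biased.
treated = (0.5 * ngo_count + ability + rng.normal(size=n) > 1.5).astype(float)
# Post-intervention outcome with a true treatment effect of 1.0.
outcome = 1.0 * treated + 2.0 * ability + rng.normal(size=n)

X = np.column_stack([np.ones(n), treated])    # second-stage regressors
Z = np.column_stack([np.ones(n), ngo_count])  # instruments (including constant)

# Just-identified IV / 2SLS estimator: beta = (Z'X)^{-1} Z'y.
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ outcome)
beta_ols = np.linalg.lstsq(X, outcome, rcond=None)[0]
print(f"OLS estimate of the treatment effect:  {beta_ols[1]:.2f} (biased upward)")
print(f"2SLS estimate of the treatment effect: {beta_iv[1]:.2f} (near the true 1.0)")

# Validity check in the spirit of the paper: with baseline (pre-intervention)
# data, a valid instrument should be unrelated to outcomes measured before the
# program operates. Regress a simulated baseline outcome on the instrument.
pre_outcome = 2.0 * ability + rng.normal(size=n)
gamma = np.linalg.lstsq(Z, pre_outcome, rcond=None)[0]
print(f"Effect of NGO count on the pre-intervention outcome: {gamma[1]:.3f} (about 0)")

In the paper's application the outcome, treatment, and instrument come from the SIF baseline survey rather than simulated data; the pre-intervention regression is the kind of test the authors use to judge which candidate instruments are valid.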

Suggested Citation

  • Menno Pradhan & Laura Rawlings & Geert Ridder, 1997. "Bolivian Social Investment Fund Analysis of Baseline Data for Impact Evaluation," Tinbergen Institute Discussion Papers 97-024/4, Tinbergen Institute.
  • Handle: RePEc:tin:wpaper:19970024

    Download full text from publisher

    File URL: https://papers.tinbergen.nl/97024.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Ashenfelter, Orley C, 1978. "Estimating the Effect of Training Programs on Earnings," The Review of Economics and Statistics, MIT Press, vol. 60(1), pages 47-57, February.
    2. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," American Economic Review, American Economic Association, vol. 80(3), pages 313-336, June.
    3. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    4. Heckman, J.J. & Hotz, V.J., 1988. "Choosing Among Alternative Nonexperimental Methods For Estimating The Impact Of Social Programs: The Case Of Manpower Training," University of Chicago - Economics Research Center 88-12, Chicago - Economics Research Center.
    5. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    6. Joshua Angrist, 1989. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," Working Papers 631, Princeton University, Department of Economics, Industrial Relations Section.
    7. Jimenez, Emmanuel, 1995. "Human and physical infrastructure: Public investment and pricing policies in developing countries," Handbook of Development Economics, in: Hollis Chenery & T.N. Srinivasan (ed.), Handbook of Development Economics, edition 1, volume 3, chapter 43, pages 2773-2843, Elsevier.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bouillon, César P. & Tejerina, Luis, 2006. "Do We Know What Works?: A Systematic Review of Impact Evaluations of Social Programs in Latin America and the Caribbean. Latest version," IDB Publications (Working Papers) 4297, Inter-American Development Bank.
    2. César P. Bouillon & Luis Tejerina, 2006. "Do We Know What Works?: A Systematic Review of Impact Evaluations of Social Programs in Latin America and the Caribbean," IDB Publications (Working Papers) 80443, Inter-American Development Bank.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Jeffrey A. Smith & Petra E. Todd, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    3. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    4. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    5. Joshua D. Angrist, 2022. "Empirical Strategies in Economics: Illuminating the Path From Cause to Effect," Econometrica, Econometric Society, vol. 90(6), pages 2509-2539, November.
    6. Michael Lechner, 2000. "An Evaluation of Public-Sector-Sponsored Continuous Vocational Training Programs in East Germany," Journal of Human Resources, University of Wisconsin Press, vol. 35(2), pages 347-375.
    7. Deborah A. Cobb‐Clark & Thomas Crossley, 2003. "Econometrics for Evaluations: An Introduction to Recent Developments," The Economic Record, The Economic Society of Australia, vol. 79(247), pages 491-511, December.
    8. Thomas Brodaty & Bruno Crépon & Denis Fougère, 2007. "Les méthodes micro-économétriques d'évaluation et leurs applications aux politiques actives de l'emploi," Economie & Prévision, La Documentation Française, vol. 0(1), pages 93-118.
    9. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    10. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    11. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    12. Ozkan Eren & Serkan Ozbeklik, 2014. "Who Benefits From Job Corps? A Distributional Analysis Of An Active Labor Market Program," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 29(4), pages 586-611, June.
    13. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
    14. Regner, Hakan, 2002. "A nonexperimental evaluation of training programs for the unemployed in Sweden," Labour Economics, Elsevier, vol. 9(2), pages 187-206, April.
    15. Greenstone, Michael & Gayer, Ted, 2009. "Quasi-experimental and experimental approaches to environmental economics," Journal of Environmental Economics and Management, Elsevier, vol. 57(1), pages 21-44, January.
    16. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    17. Guido W. Imbens, 2022. "Causality in Econometrics: Choice vs Chance," Econometrica, Econometric Society, vol. 90(6), pages 2541-2566, November.
    18. John DiNardo & David S. Lee, 2010. "Program Evaluation and Research Designs," Working Papers 1228, Princeton University, Department of Economics, Industrial Relations Section.
    19. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    20. Markus Frölich, 2004. "Programme Evaluation with Multiple Treatments," Journal of Economic Surveys, Wiley Blackwell, vol. 18(2), pages 181-224, April.
