Printed from https://ideas.repec.org/p/nbr/nberwo/19742.html

Endogenous Stratification in Randomized Experiments

Author

Listed:
  • Alberto Abadie
  • Matthew M. Chingos
  • Martin R. West

Abstract

Researchers and policy makers are often interested in estimating how treatments or policy interventions affect the outcomes of those most in need of help. This concern has motivated the increasingly common practice of disaggregating experimental data by groups constructed on the basis of an index of baseline characteristics that predicts the values that individual outcomes would take on in the absence of the treatment. This article shows that substantial biases may arise in practice if the index is estimated, as is often the case, by regressing the outcome variable on baseline characteristics for the full sample of experimental controls. We analyze the behavior of leave-one-out and repeated split-sample estimators and show that they behave well in realistic scenarios, correcting the large bias problem of the full-sample estimator. We use data from the National JTPA Study and the Tennessee STAR experiment to demonstrate the performance of alternative estimators and the magnitude of their biases.
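
The abstract describes the estimators only in words; the sketch below, which is not the authors' code, illustrates one way a leave-one-out predicted-outcome index could be built and used to stratify an experimental sample. The variable names (y, d, X, n_groups), the NumPy implementation, and the quantile-based grouping are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of endogenous stratification
# with a leave-one-out predicted-outcome index. Assumed inputs:
#   y : outcomes, shape (n,)
#   d : treatment indicator, 1 = treated, 0 = control, shape (n,)
#   X : baseline covariates, shape (n, k)
import numpy as np


def loo_endogenous_stratification(y, d, X, n_groups=3):
    y, d, X = np.asarray(y, float), np.asarray(d, int), np.asarray(X, float)
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])        # add an intercept
    ctrl = np.flatnonzero(d == 0)
    yhat = np.empty(n)

    # Treated units: predict the untreated outcome from a regression
    # fit on the full sample of experimental controls.
    beta_all, *_ = np.linalg.lstsq(X1[ctrl], y[ctrl], rcond=None)
    yhat[d == 1] = X1[d == 1] @ beta_all

    # Control units: leave each control out of the regression used to
    # predict its own outcome; this removes the overfitting that biases
    # the full-sample index.
    for i in ctrl:
        keep = ctrl[ctrl != i]
        beta_i, *_ = np.linalg.lstsq(X1[keep], y[keep], rcond=None)
        yhat[i] = X1[i] @ beta_i

    # Stratify on the predicted outcome (quantile groups here, as a
    # simplification) and compare treated and control means per stratum.
    cuts = np.quantile(yhat, np.linspace(0, 1, n_groups + 1))[1:-1]
    group = np.searchsorted(cuts, yhat)          # stratum index 0..n_groups-1
    return np.array([
        y[(group == g) & (d == 1)].mean() - y[(group == g) & (d == 0)].mean()
        for g in range(n_groups)
    ])
```

A repeated split-sample variant of the same idea would fit the prediction on a random half of the controls, stratify and estimate on the remaining sample, and average the group-specific estimates over many random splits.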

Suggested Citation

  • Alberto Abadie & Matthew M. Chingos & Martin R. West, 2013. "Endogenous Stratification in Randomized Experiments," NBER Working Papers 19742, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:19742
    Note: CH DEV ED EH LS PE TWP

    Download full text from publisher

    File URL: http://www.nber.org/papers/w19742.pdf
    Download Restriction: no


    References listed on IDEAS

    1. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 487-535.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. Lisa Sanbonmatsu & Jeffrey R. Kling & Greg J. Duncan & Jeanne Brooks-Gunn, 2006. "Neighborhoods and Academic Achievement: Results from the Moving to Opportunity Experiment," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    4. Susan Dynarski & Joshua Hyman & Diane Whitmore Schanzenbach, 2013. "Experimental Evidence on the Effect of Childhood Investments on Postsecondary Attainment and Degree Completion," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(4), pages 692-717, September.
    5. Rodríguez-Planas, Núria, 2012. "School and Drugs: Closing the Gap - Evidence from a Randomized Trial in the US," IZA Discussion Papers 6770, Institute of Labor Economics (IZA).
    6. Joshua Angrist & Victor Lavy, 2009. "The Effects of High Stakes High School Achievement Awards: Evidence from a Randomized Trial," American Economic Review, American Economic Association, vol. 99(4), pages 1384-1414, September.
    7. Douglas N. Harris & Sara Goldrick-Rab, 2012. "Improving the Productivity of Education Experiments: Lessons from a Randomized Study of Need-Based Financial Aid," Education Finance and Policy, MIT Press, vol. 7(2), pages 143-169, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    2. Scott Carrell & Bruce Sacerdote, 2017. "Why Do College-Going Interventions Work?," American Economic Journal: Applied Economics, American Economic Association, vol. 9(3), pages 124-151, July.
    3. Stephen Gibbons & Olmo Silva & Felix Weinhardt, 2013. "Everybody Needs Good Neighbours? Evidence from Students’ Outcomes in England," Economic Journal, Royal Economic Society, vol. 123, pages 831-874, September.
    4. Rosa Sanchis-Guarner & José Montalbán & Felix Weinhardt, 2021. "Home Broadband and Human Capital Formation," CESifo Working Paper Series 8846, CESifo.
    5. Judith Favereau & Nicolas Brisset, 2016. "Randomization of What? Moving from Libertarian to "Democratic Paternalism"," GREDEG Working Papers 2016-34, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    6. Stephen Gibbons & Olmo Silva & Felix Weinhardt, 2010. "Do Neighbours Affect Teenage Outcomes? Evidence from Neighbourhood Changes in England," CEE Discussion Papers 0122, Centre for the Economics of Education, LSE.
    7. Ayako Wakano & Hiroyuki Yamada & Daichi Shimamoto, 2017. "Does the Heterogeneity of Project Implementers Affect the Programme Participation of Beneficiaries?: Evidence from Rural Cambodia," Journal of Development Studies, Taylor & Francis Journals, vol. 53(1), pages 49-67, January.
    8. Eric D. Gould & Victor Lavy & M. Daniele Paserman, 2009. "Does Immigration Affect the Long-Term Educational Outcomes of Natives? Quasi-Experimental Evidence," Economic Journal, Royal Economic Society, vol. 119(540), pages 1243-1269, October.
    9. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    10. Heckman, James J. & Humphries, John Eric & Veramendi, Gregory, 2016. "Dynamic treatment effects," Journal of Econometrics, Elsevier, vol. 191(2), pages 276-292.
    11. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    12. Franziska Kugler & Guido Schwerdt & Ludger Wößmann, 2014. "Ökonometrische Methoden zur Evaluierung kausaler Effekte der Wirtschaftspolitik" [Econometric Methods for Evaluating Causal Effects of Economic Policy], Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(2), pages 105-132, June.
    13. Judith Favereau & Nicolas Brisset, 2016. "Randomization of What? Moving from Libertarian to "Democratic Paternalism". GREDEG Working Papers Series," Working Papers hal-02092638, HAL.
    14. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    15. Dionissi Aliprantis, 2017. "Assessing the evidence on neighborhood effects from Moving to Opportunity," Empirical Economics, Springer, vol. 52(3), pages 925-954, May.
    16. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    17. Cassidy, Michael T., 2020. "A Closer Look: Proximity Boosts Homeless Student Performance in New York City," IZA Discussion Papers 13558, Institute of Labor Economics (IZA).
    18. Czajkowski, Mikołaj & Zagórska, Katarzyna & Letki, Natalia & Tryjanowski, Piotr & Wąs, Adam, 2021. "Drivers of farmers’ willingness to adopt extensive farming practices in a globally important bird area," Land Use Policy, Elsevier, vol. 107(C).
    19. Arnaud Chevalier & Colm Harmon & Vincent O’Sullivan & Ian Walker, 2013. "The impact of parental income and education on the schooling of their children," IZA Journal of Labor Economics, Springer; Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 2(1), pages 1-22, December.
    20. Ismaël Mourifié & Marc Henry & Romuald Méango, 2020. "Sharp Bounds and Testability of a Roy Model of STEM Major Choices," Journal of Political Economy, University of Chicago Press, vol. 128(8), pages 3220-3283.

    More about this item

    JEL classification:

    • C01 - Mathematical and Quantitative Methods - - General - - - Econometrics
    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:19742. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.