
Assessing the Performance of Nonexperimental Estimators for Evaluating Head Start

Authors

Listed:
  • Andrew S. Griffen
  • Petra E. Todd

Abstract

This paper uses experimental data from the Head Start Impact Study (HSIS) combined with nonexperimental data from the Early Childhood Longitudinal Study–Birth Cohort (ECLS-B) to study the performance of nonexperimental estimators for evaluating Head Start program impacts. The estimators studied include parametric cross-section and difference-in-differences regression estimators and nonparametric cross-section and difference-in-differences matching estimators. The estimators are used to estimate program impacts on cognitive achievement test scores, child health measures, parenting behaviors, and parent labor market outcomes. Some of the estimators closely reproduce the experimental results, but it would be difficult to know a priori whether an estimator works well for any particular outcome. Pre-program exogeneity tests eliminate some of the outcomes and estimators with the worst biases, but some estimator/outcome combinations with substantial biases still pass the tests. The difference-in-differences matching estimator performs best in terms of low bias and of capturing the pattern of statistically significant treatment effects. However, bias varies more across the outcomes examined than across the methods.
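
For readers unfamiliar with the estimator class the abstract highlights, the following is a minimal sketch (in Python) of a difference-in-differences matching estimator of the average treatment effect on the treated, assuming nearest-neighbor matching on observed covariates and illustrative variable names (treated, y_pre, y_post). It is not the authors' implementation, which is described in the article itself.

    import numpy as np
    import pandas as pd
    from sklearn.neighbors import NearestNeighbors

    def did_matching_att(df, covariates, k=1):
        """Difference-in-differences matching estimate of the average treatment
        effect on the treated (ATT): match each treated unit to its k nearest
        comparison units on `covariates` and difference the pre-to-post gains."""
        treated = df[df["treated"] == 1]
        control = df[df["treated"] == 0]

        # Nearest-neighbor matching on the comparison group's covariates.
        nn = NearestNeighbors(n_neighbors=k).fit(control[covariates].to_numpy())
        _, idx = nn.kneighbors(treated[covariates].to_numpy())

        # Pre-to-post outcome change ("gain") for each unit.
        gain_t = (treated["y_post"] - treated["y_pre"]).to_numpy()
        gain_c = (control["y_post"] - control["y_pre"]).to_numpy()

        # Average the matched comparison gains for each treated unit, then
        # average the treated-minus-matched differences.
        return float(np.mean(gain_t - gain_c[idx].mean(axis=1)))

    # Purely illustrative simulated data with a true effect of 0.4.
    rng = np.random.default_rng(0)
    n = 2000
    x = rng.normal(size=(n, 2))
    p_treat = 1.0 / (1.0 + np.exp(-x[:, 0]))          # selection on observables
    treated = (rng.uniform(size=n) < p_treat).astype(int)
    y_pre = 0.5 * x[:, 0] + 0.3 * x[:, 1] + rng.normal(size=n)
    y_post = y_pre + 0.2 + 0.4 * treated + rng.normal(size=n)
    df = pd.DataFrame({"x1": x[:, 0], "x2": x[:, 1],
                       "treated": treated, "y_pre": y_pre, "y_post": y_post})
    print(did_matching_att(df, ["x1", "x2"], k=5))

With these simulated data the estimate recovers the built-in effect of roughly 0.4; the paper applies estimators of this type to the HSIS and ECLS-B samples with a much richer set of covariates and outcomes.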

Suggested Citation

  • Andrew S. Griffen & Petra E. Todd, 2017. "Assessing the Performance of Nonexperimental Estimators for Evaluating Head Start," Journal of Labor Economics, University of Chicago Press, vol. 35(S1), pages 7-63.
  • Handle: RePEc:ucp:jlabec:doi:10.1086/691726
    DOI: 10.1086/691726

    Download full text from publisher

    File URL: http://dx.doi.org/10.1086/691726
    Download Restriction: Access to the online full text or PDF requires a subscription.

    File URL: https://libkey.io/10.1086/691726?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access with your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Sauermann, Jan & Stenberg, Anders, 2020. "Assessing Selection Bias in Non-Experimental Estimates of the Returns to Workplace Training," IZA Discussion Papers 13789, Institute of Labor Economics (IZA).
    2. Chan, M. & Dalla-Zuanna, A., 2023. "Understanding Program Complementarities: Estimating the Dynamic Effects of Head Start with Multiple Alternatives," Cambridge Working Papers in Economics 2330, Faculty of Economics, University of Cambridge.
    3. David Rhys Bernard & Gharad Bryan & Sylvain Chabé-Ferret & Jonathan de Quidt & Jasmin Claire Fliegner & Roland Rathelot, 2023. "How Much Should We Trust Observational Estimates? Accumulating Evidence Using Randomized Controlled Trials with Imperfect Compliance," Working Papers 976, Queen Mary University of London, School of Economics and Finance.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucp:jlabec:doi:10.1086/691726. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journals Division (email available below). General contact details of provider: https://www.journals.uchicago.edu/JOLE.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.