
The WWC Attrition Standard

Author

Listed:
  • John Deke
  • Hanley Chiang

Abstract

Background: To limit the influence of attrition bias in assessments of intervention effectiveness, several federal evidence reviews have established a standard for acceptable levels of sample attrition in randomized controlled trials. These evidence reviews include the What Works Clearinghouse (WWC), the Home Visiting Evidence of Effectiveness Review, and the Teen Pregnancy Prevention Evidence Review. We believe the WWC attrition standard may constitute the first use of model-based, empirically supported bounds on attrition bias in the context of a federally sponsored systematic evidence review. Meeting the WWC attrition standard (or one of the attrition standards based on it) is now an important consideration for researchers conducting studies that could be reviewed by the WWC or other evidence reviews.

Objectives: The purpose of this article is to explain the WWC attrition model, to describe how that model is used to establish attrition bounds, and to assess the sensitivity of those bounds to key parameter values.

Research Design: Results are based on equations derived in the article and on values generated by applying those equations to a range of parameter values.

Results: The authors find that the attrition boundaries are more sensitive to the maximum level of bias that an evidence review is willing to tolerate than to the other parameters in the attrition model.

Conclusions: The authors conclude that the most productive refinements to existing attrition standards may concern the definition of "maximum tolerable bias."
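The attrition model the abstract describes evaluates a study on two quantities, overall attrition and differential attrition, and compares them to a boundary derived from a maximum tolerable bias. The sketch below is purely illustrative: it computes the two rates from randomized and analyzed sample counts and checks them against a hypothetical linear boundary. The `intercept` and `slope` values are invented for illustration and are not the WWC's actual boundary parameters.

```python
def attrition_rates(randomized_t, analyzed_t, randomized_c, analyzed_c):
    """Overall and differential attrition rates for a two-arm RCT.

    Overall attrition pools attriters across both arms; differential
    attrition is the absolute gap between the arm-specific rates.
    """
    rate_t = 1 - analyzed_t / randomized_t
    rate_c = 1 - analyzed_c / randomized_c
    n = randomized_t + randomized_c
    overall = ((randomized_t - analyzed_t) + (randomized_c - analyzed_c)) / n
    differential = abs(rate_t - rate_c)
    return overall, differential


def within_boundary(overall, differential, intercept=0.10, slope=0.16):
    """Hypothetical linear attrition boundary (illustrative values only):
    the tolerable differential attrition shrinks as overall attrition
    grows, reaching zero once overall attrition is high enough.
    """
    return differential <= max(intercept - slope * overall, 0.0)
```

For example, a trial randomizing 500 per arm and analyzing 450 treatment and 420 control cases has overall attrition of 0.13 and differential attrition of 0.06, which falls inside this illustrative boundary; the same differential at 30% overall attrition would not.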

Suggested Citation

  • John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
  • Handle: RePEc:sae:evarev:v:41:y:2017:i:2:p:130-154
    DOI: 10.1177/0193841X16670047

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X16670047
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X16670047?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Citations



    Cited by:

    1. Douglas J. Besharov & Douglas M. Call & Jason M. Scott, 2020. "PROTOCOL: Early childhood education programs for improving the development and achievement of low‐income children: a systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 16(3), September.
    2. Jaime Thomas & Sarah A. Avellar & John Deke & Philip Gleason, 2017. "Matched Comparison Group Design Standards in Systematic Reviews of Early Childhood Interventions," Evaluation Review, vol. 41(3), pages 240-279, June.
    3. Richard Hendra & Aaron Hill, 2019. "Rethinking Response Rates: New Evidence of Little Relationship Between Survey Response Rates and Nonresponse Bias," Evaluation Review, vol. 43(5), pages 307-330, October.
    4. Jacob Alex Klerman, 2017. "Special Issue Editor’s Overview Essay," Evaluation Review, vol. 41(3), pages 175-182, June.
    5. Ben Weidmann & Luke Miratrix, 2021. "Missing, presumed different: Quantifying the risk of attrition bias in education evaluations," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 732-760, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lisa Dragoset & Jaime Thomas & Mariesa Herrmann & John Deke & Susanne James-Burdumy & Cheryl Graczewski & Andrea Boyle & Rachel Upton & Courtney Tanenbaum & Jessica Giffin, "undated". "School Improvement Grants: Implementation and Effectiveness (Final Report)," Mathematica Policy Research Reports 76bce3f4bb0944f29a481fae0, Mathematica Policy Research.
    2. Lisa Dragoset & Susanne James-Burdumy & Kristin Hallgren & Irma Perez-Johnson & Mariesa Herrmann & Christina Tuttle & Megan Hague Angus & Rebecca Herman & Matthew Murray & Courtney Tanenbaum & Cheryl , 2015. "Usage of Practices Promoted by School Improvement Grants," Mathematica Policy Research Reports 8e99f01663504ef5b9f8357f6, Mathematica Policy Research.
    3. John Deke, "undated". "Causal Validity Considerations for Including High Quality Non-Experimental Evidence in Systematic Reviews," Mathematica Policy Research Reports 676a04feb19e4904a052ba2e7, Mathematica Policy Research.
    4. John Deke & Thomas Wei & Tim Kautz, "undated". "Asymdystopia: The Threat of Small Biases in Evaluations of Education Interventions that Need to be Powered to Detect Small Impacts," Mathematica Policy Research Reports f0ff8f86e3c34dc8baaf22b56, Mathematica Policy Research.
    5. Peter Z. Schochet & Hanley Chiang, "undated". "Technical Methods Report: Estimation and Identification of the Complier Average Causal Effect Parameter in Education RCTs," Mathematica Policy Research Reports 947d1823e3ff42208532a763d, Mathematica Policy Research.
    6. Kaitlin Anderson & Gema Zamarro & Jennifer Steele & Trey Miller, 2021. "Comparing Performance of Methods to Deal With Differential Attrition in Randomized Experimental Evaluations," Evaluation Review, vol. 45(1-2), pages 70-104, February.
    7. Turner, Alex J. & Fichera, Eleonora & Sutton, Matt, 2021. "The effects of in-utero exposure to influenza on mental health and mortality risk throughout the life-course," Economics & Human Biology, Elsevier, vol. 43(C).
    8. Kjetil Bjorvatn & Alexander W. Cappelen & Linda Helgesson Sekei & Erik Ø. Sørensen & Bertil Tungodden, 2020. "Teaching Through Television: Experimental Evidence on Entrepreneurship Education in Tanzania," Management Science, INFORMS, vol. 66(6), pages 2308-2325, June.
    9. McGovern, Mark E. & Canning, David & Bärnighausen, Till, 2018. "Accounting for non-response bias using participation incentives and survey design: An application using gift vouchers," Economics Letters, Elsevier, vol. 171(C), pages 239-244.
    10. Vegas, E & Ganimian, A. J., 2013. "Theory and Evidence on Teacher Policies in Developed and Developing Countries," Working Paper 104291, Harvard University OpenScholar.
    12. Semenova, Vira, 2023. "Debiased machine learning of set-identified linear models," Journal of Econometrics, Elsevier, vol. 235(2), pages 1725-1746.
    13. Matthew A. Kraft, 2014. "How to Make Additional Time Matter: Integrating Individualized Tutorials into an Extended Day," Education Finance and Policy, MIT Press, vol. 10(1), pages 81-116, November.
    14. Roland G. Fryer, Jr, 2016. "The Production of Human Capital in Developed Countries: Evidence from 196 Randomized Field Experiments," NBER Working Papers 22130, National Bureau of Economic Research, Inc.
    15. Lukáš Lafférs, 2019. "Identification in Models with Discrete Variables," Computational Economics, Springer;Society for Computational Economics, vol. 53(2), pages 657-696, February.
    16. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    17. John Deke & Lisa Dragoset, "undated". "Statistical Power for Regression Discontinuity Designs in Education: Empirical Estimates of Design Effects Relative to Randomized Controlled Trials," Mathematica Policy Research Reports a4f1d03eb7bf427a8983d4736, Mathematica Policy Research.
    18. Bo E. Honoré & Luojia Hu, 2020. "Selection Without Exclusion," Econometrica, Econometric Society, vol. 88(3), pages 1007-1029, May.
    19. François Gerard & Miikka Rokkanen & Christoph Rothe, 2020. "Bounds on treatment effects in regression discontinuity designs with a manipulated running variable," Quantitative Economics, Econometric Society, vol. 11(3), pages 839-870, July.
    20. Vira Semenova, 2023. "Adaptive Estimation of Intersection Bounds: a Classification Approach," Papers 2303.00982, arXiv.org.
    21. Cabrera Hernández, Francisco-Javier, 2016. "Essays on the impact evaluation of education policies in Mexico," Economics PhD Theses 0316, Department of Economics, University of Sussex Business School.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:41:y:2017:i:2:p:130-154. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.