Reference Values of Within-District Intraclass Correlations of Academic Achievement by District Characteristics

Authors

  • E. C. Hedberg
  • Larry V. Hedges

Abstract

Background: Randomized experiments are often considered the strongest designs for studying the impact of educational interventions. Perhaps the most prevalent class of designs used in large-scale education experiments is the cluster randomized design, in which entire schools are assigned to treatments. In cluster randomized trials that assign schools to treatments within a set of school districts, the statistical power of the test for treatment effects depends on the within-district school-level intraclass correlation (ICC). Hedges and Hedberg (2014) recently computed within-district ICC values in 11 states using three-level models (students in schools in districts) that pooled results across all the districts within each state. Although values from these analyses are useful when working with a representative sample of districts, they may be misleading for other samples of districts because the magnitude of the ICC appears to be related to district size. To plan studies with small or nonrepresentative samples of districts, better information is needed about the relation of within-district school-level ICCs to district size.

Objective: Our objective is to explore the relation between district size and within-district ICCs and to provide reference values for math and reading achievement in Grades 3–8 by district size, poverty level, and urbanicity level. These values are not derived by pooling across all districts within a state, as in previous work, but are based on the direct calculation of within-district school-level ICCs for each school district.

Research Design: We use mixed models to estimate over 7,000 district-specific ICCs for math and reading achievement in 11 states and for Grades 3–8. We then perform a random-effects meta-analysis on the estimated within-district ICCs. Our analysis is performed by grade and subject for different strata designated by district size (number of schools), urbanicity, and poverty rate.
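The research design described above can be illustrated with a short sketch: for each district, a random-intercept (mixed) model of students nested in schools yields a school-level ICC, rho = tau^2 / (tau^2 + sigma^2), and the district-specific estimates are then pooled with a random-effects meta-analysis. The Python sketch below is only illustrative and is not the authors' code: the column names (score, school_id, district_id), the use of statsmodels, and the DerSimonian-Laird pooling step are assumptions made here for concreteness, and the sampling variance of each ICC estimate would need to come from a separate approximation.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def district_icc(district_df):
        """Estimate the within-district, school-level ICC for one district.

        Fits a random-intercept model (students nested in schools) and returns
        rho = tau^2 / (tau^2 + sigma^2), the share of achievement variance that
        lies between schools within the district.
        """
        model = smf.mixedlm("score ~ 1", data=district_df,
                            groups=district_df["school_id"])
        fit = model.fit(reml=True)
        tau2 = float(fit.cov_re.iloc[0, 0])   # between-school variance
        sigma2 = fit.scale                    # within-school (student) variance
        return tau2 / (tau2 + sigma2)

    def random_effects_meta(iccs, variances):
        """Pool district-specific ICCs with a DerSimonian-Laird random-effects
        meta-analysis (a standard choice, assumed here for illustration)."""
        iccs, variances = np.asarray(iccs), np.asarray(variances)
        w = 1.0 / variances
        fixed = np.sum(w * iccs) / np.sum(w)
        q = np.sum(w * (iccs - fixed) ** 2)          # heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(iccs) - 1)) / c)   # between-district variance
        w_star = 1.0 / (variances + tau2)
        pooled = np.sum(w_star * iccs) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, se

    # Hypothetical usage: one ICC per district, then pool within a stratum
    # (e.g., districts with a similar number of schools). The sampling
    # variance of each ICC estimate must come from a separate approximation.
    # data = pd.read_csv("achievement.csv")
    # iccs = [district_icc(g) for _, g in data.groupby("district_id")]

In the article itself, the pooled values are reported by grade and subject for strata defined by district size, urbanicity, and poverty rate.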

Suggested Citation

  • E. C. Hedberg & Larry V. Hedges, 2014. "Reference Values of Within-District Intraclass Correlations of Academic Achievement by District Characteristics," Evaluation Review, vol. 38(6), pages 546-582, December.
  • Handle: RePEc:sae:evarev:v:38:y:2014:i:6:p:546-582
    DOI: 10.1177/0193841X14554212

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X14554212
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X14554212?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. Larry V. Hedges & E. C. Hedberg, 2013. "Intraclass Correlations and Covariate Outcome Correlations for Planning Two- and Three-Level Cluster-Randomized Experiments in Education," Evaluation Review, vol. 37(6), pages 445-489, December.
    2. Eric C. Hedberg, 2011. "RDPOWER: Stata module to perform power calculations for random designs," Statistical Software Components S457260, Boston College Department of Economics, revised 12 Feb 2012.
    3. Roberto Agodini & Barbara Harris & Sally Atkins-Burnett & Sheila Heaviside & Timothy Novak & Robert Murphy, "undated". "Achievement Effects of Four Early Elementary School Math Curricula: Findings from First Graders in 39 Schools," Mathematica Policy Research Reports 467194cd2d6f4cbaba7e23745, Mathematica Policy Research.
    4. Peter Z. Schochet, "undated". "Statistical Power for Random Assignment Evaluations of Education Programs," Mathematica Policy Research Reports 6749d31ad72d4acf988f7dce5, Mathematica Policy Research.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Zachary Neal, 2021. "Does the neighbourhood matter for neighbourhood satisfaction? A meta-analysis," Urban Studies, Urban Studies Journal Limited, vol. 58(9), pages 1775-1791, July.
    2. E. C. Hedberg, 2016. "Academic and Behavioral Design Parameters for Cluster Randomized Trials in Kindergarten," Evaluation Review, vol. 40(4), pages 279-313, August.
    3. Nathan M. VanHoudnos & Joel B. Greenhouse, 2016. "On the Hedges Correction for a t-Test," Journal of Educational and Behavioral Statistics, vol. 41(4), pages 392-419, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Elizabeth Tipton & Robert B. Olsen, "undated". "Enhancing the Generalizability of Impact Studies in Education," Mathematica Policy Research Reports 35d5625333dc480aba9765b3b, Mathematica Policy Research.
    2. Jessaca Spybrook & Benjamin Kelcey, 2016. "Introduction to Three Special Issues on Design Parameter Values for Planning Cluster Randomized Trials in the Social Sciences," Evaluation Review, vol. 40(6), pages 491-499, December.
    3. Ben Kelcey & Zuchao Shen & Jessaca Spybrook, 2016. "Intraclass Correlation Coefficients for Designing Cluster-Randomized Trials in Sub-Saharan Africa Education," Evaluation Review, vol. 40(6), pages 500-525, December.
    4. World Bank, 2017. "Pre-Primary Education in Mongolia," World Bank Publications - Reports 26402, The World Bank Group.
    5. Nianbo Dong & Wendy M. Reinke & Keith C. Herman & Catherine P. Bradshaw & Desiree W. Murray, 2016. "Meaningful Effect Sizes, Intraclass Correlations, and Proportions of Variance Explained by Covariates for Planning Two- and Three-Level Cluster Randomized Trials of Social and Behavioral Outcomes," Evaluation Review, vol. 40(4), pages 334-377, August.
    6. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
    7. Daniel McNeish & Jeffrey R. Harring & Denis Dumas, 2023. "A multilevel structured latent curve model for disaggregating student and school contributions to learning," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 32(2), pages 545-575, June.
    8. Deborah Peikes & Stacy Dale & Eric Lundquist & Janice Genevro & David Meyers, 2011. "Building the Evidence Base for the Medical Home: What Sample and Sample Size Do Studies Need?," Mathematica Policy Research Reports 5814eb8219b24982af7f7536c, Mathematica Policy Research.
    9. Peter Z. Schochet, 2018. "Design-Based Estimators for Average Treatment Effects for Multi-Armed RCTs," Journal of Educational and Behavioral Statistics, vol. 43(5), pages 568-593, October.
    10. Andrew P. Jaciw & Li Lin & Boya Ma, 2016. "An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials," Evaluation Review, vol. 40(5), pages 410-443, October.
    11. John Deke, 2016. "Design and Analysis Considerations for Cluster Randomized Controlled Trials That Have a Small Number of Clusters," Evaluation Review, vol. 40(5), pages 444-486, October.
    12. John Deke & Lisa Dragoset, "undated". "Statistical Power for Regression Discontinuity Designs in Education: Empirical Estimates of Design Effects Relative to Randomized Controlled Trials," Mathematica Policy Research Reports a4f1d03eb7bf427a8983d4736, Mathematica Policy Research.
    13. Peter Z. Schochet, 2020. "Analyzing Grouped Administrative Data for RCTs Using Design-Based Methods," Journal of Educational and Behavioral Statistics, vol. 45(1), pages 32-57, February.
    14. Cory Koedel & Rachana Bhatt, 2011. "Large-Scale Evaluations of Curricular Effectiveness: The Case of Elementary Mathematics in Indiana," Working Papers 1122, Department of Economics, University of Missouri, revised 31 Jan 2012.
    15. Peter Z. Schochet & Hanley S. Chiang, "undated". "Error Rates in Measuring Teacher and School Performance Based on Student Test Score Gains," Mathematica Policy Research Reports d415285f980b4d64b7e75f40b, Mathematica Policy Research.
    16. Christopher H. Rhoads, 2011. "The Implications of “Contamination” for Experimental Design in Education," Journal of Educational and Behavioral Statistics, vol. 36(1), pages 76-104, February.
    17. Peter Z. Schochet, 2021. "Long-Run Labor Market Effects of the Job Corps Program: Evidence from a Nationally Representative Experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(1), pages 128-157, January.
    18. Peter Z. Schochet, 2013. "Student Mobility, Dosage, and Principal Stratification in School-Based RCTs," Journal of Educational and Behavioral Statistics, vol. 38(4), pages 323-354, August.
    19. de Hoyos, Rafael & Garcia-Moreno, Vicente A. & Patrinos, Harry Anthony, 2017. "The impact of an accountability intervention with diagnostic feedback: Evidence from Mexico," Economics of Education Review, Elsevier, vol. 58(C), pages 123-140.
    20. Yang Tang & Thomas D. Cook, 2018. "Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study," Evaluation Review, vol. 42(1), pages 71-110, February.

    More about this item

    Keywords

    education; methodological development;

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:38:y:2014:i:6:p:546-582. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.