Printed from https://ideas.repec.org/a/sae/evarev/v40y2016i5p410-443.html

An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials

Authors
  • Andrew P. Jaciw
  • Li Lin
  • Boya Ma

Abstract

Background: Prior research has investigated design parameters for assessing average program impacts on achievement outcomes with cluster randomized trials (CRTs). Less is known about the parameters important for assessing differential impacts.

Objectives: This article develops a statistical framework for designing CRTs to assess differences in impact among student subgroups and presents initial estimates of critical parameters.

Research design: Effect sizes and minimum detectable effect sizes for average and differential impacts are calculated before and after conditioning on effects of covariates, using results from several CRTs. Relative sensitivities for detecting average and differential impacts are also examined.

Subjects: Student outcomes from six CRTs are analyzed.

Measures: Achievement in math, science, reading, and writing.

Results: The ratio of between-cluster variation in the slope of the moderator to total variance—the “moderator gap variance ratio”—is important for designing studies to detect differences in impact between student subgroups; it is the analogue of the intraclass correlation coefficient. Typical values were .02 for gender and .04 for socioeconomic status. In many of the studies considered, estimates of differential impact were larger than those of average impact, and after conditioning on effects of covariates, similar power was achieved for detecting average and differential impacts of the same size.

Conclusions: Measuring differential impacts is important for addressing questions of equity and generalizability and for guiding the interpretation of subgroup impact findings. Adequate power for doing so is in some cases attainable with CRTs designed to measure average impacts. Continued collection of parameters for assessing differential impacts is the next step.
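The design quantities named in the abstract can be made concrete with a small sketch. The functions below use the standard Bloom-style minimum detectable effect size (MDES) approximation for a two-arm CRT, with the "moderator gap variance ratio" substituted into the cluster-level variance term for the differential-impact case. The function names, the multiplier of 2.8 (α = .05 two-tailed, 80% power, large degrees of freedom), and the illustrative parameter values are assumptions for illustration, not the article's exact derivation.

```python
from math import sqrt

# Illustrative sketch, not the article's exact framework.
# MDES ~= M * sqrt(variance of the standardized impact estimator),
# with M ~= 2.8 (alpha = .05 two-tailed, 80% power, large df).

def mdes_average(icc, n_clusters, cluster_size, p_treat=0.5, multiplier=2.8):
    """MDES for the average impact in a two-arm CRT (Bloom-style formula)."""
    j, n, p = n_clusters, cluster_size, p_treat
    cluster_term = icc / (p * (1 - p) * j)              # between-cluster part
    student_term = (1 - icc) / (p * (1 - p) * j * n)    # within-cluster part
    return multiplier * sqrt(cluster_term + student_term)

def mdes_differential(gap_ratio, icc, n_clusters, cluster_size,
                      subgroup_share=0.5, p_treat=0.5, multiplier=2.8):
    """MDES for a subgroup impact difference: the moderator gap variance
    ratio replaces the ICC in the cluster-level term, and the student-level
    term is inflated by the within-cluster subgroup split.  This is an
    assumed, simplified analogue of the article's framework."""
    j, n, p, q = n_clusters, cluster_size, p_treat, subgroup_share
    cluster_term = gap_ratio / (p * (1 - p) * j)
    student_term = (1 - icc) / (p * (1 - p) * j * n * q * (1 - q))
    return multiplier * sqrt(cluster_term + student_term)

# Hypothetical design: 40 schools of 60 students, ICC = .15,
# gender-like moderator gap variance ratio = .02 (a typical value above).
avg = mdes_average(icc=0.15, n_clusters=40, cluster_size=60)
diff = mdes_differential(gap_ratio=0.02, icc=0.15, n_clusters=40, cluster_size=60)
```

Under these illustrative numbers the average-impact MDES comes out near 0.36 and the differential-impact MDES near 0.25 — consistent with the abstract's observation that a CRT powered for average impacts can, in some cases, also detect differential impacts, because the moderator gap variance ratio (.02) is much smaller than a typical achievement ICC.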

Suggested Citation

  • Andrew P. Jaciw & Li Lin & Boya Ma, 2016. "An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials," Evaluation Review, vol. 40(5), pages 410-443, October.
  • Handle: RePEc:sae:evarev:v:40:y:2016:i:5:p:410-443
    DOI: 10.1177/0193841X16659600

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X16659600
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. World Bank, 2017. "Pre-Primary Education in Mongolia," World Bank Publications - Reports 26402, The World Bank Group.
    2. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    3. Lukas Haffert, 2019. "War mobilization or war destruction? The unequal rise of progressive taxation revisited," The Review of International Organizations, Springer, vol. 14(1), pages 59-82, March.
    4. Michael A. Allen & Michael E. Flynn & Julie VanDusky-Allen, 2017. "Regions of Hierarchy and Security: US Troop Deployments, Spatial Relations, and Defense Burdens," International Interactions, Taylor & Francis Journals, vol. 43(3), pages 397-423, May.
    5. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports, Mathematica Policy Research.
    6. Deborah Peikes & Stacy Dale & Eric Lundquist & Janice Genevro & David Meyers, 2011. "Building the Evidence Base for the Medical Home: What Sample and Sample Size Do Studies Need?," Mathematica Policy Research Reports, Mathematica Policy Research.
    7. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    8. John Deke, 2016. "Design and Analysis Considerations for Cluster Randomized Controlled Trials That Have a Small Number of Clusters," Evaluation Review, vol. 40(5), pages 444-486, October.
    9. Ignacio, Escañuela Romana, 2019. "The elasticities of passenger transport demand in the Northeast Corridor," Research in Transportation Economics, Elsevier, vol. 78(C).
    10. Peter Z. Schochet, 2020. "Analyzing Grouped Administrative Data for RCTs Using Design-Based Methods," Journal of Educational and Behavioral Statistics, vol. 45(1), pages 32-57, February.
    11. Gezahegn, Tafesse & Van Passel, Steven & Berhanu, Tekeste & D'Haese, Marijke & Maertens, Miet, 2020. "Structural and Institutional Heterogeneity among Agricultural Cooperatives in Ethiopia: Does it Matter for Farmers' Welfare?," Journal of Agricultural and Resource Economics, Western Agricultural Economics Association, vol. 46(2), August.
    12. Peter Z. Schochet & Hanley S. Chiang, "undated". "Error Rates in Measuring Teacher and School Performance Based on Student Test Score Gains," Mathematica Policy Research Reports, Mathematica Policy Research.
    13. Li, Xiaoyu & Kawachi, Ichiro & Buxton, Orfeu M. & Haneuse, Sebastien & Onnela, Jukka-Pekka, 2019. "Social network analysis of group position, popularity, and sleep behaviors among U.S. adolescents," Social Science & Medicine, Elsevier, vol. 232(C), pages 417-426.
    14. Marko Hofmann & Silja Meyer-Nieberg, 2018. "Time to dispense with the p-value in OR?," Central European Journal of Operations Research, Springer, vol. 26(1), pages 193-214, March.
    15. Melisa Bubonya & Deborah A. Cobb-Clark & Daniel Christensen & Sarah E. Johnson & Stephen R. Zubrick, 2019. "The Great Recession and Children's Mental Health in Australia," IJERPH, MDPI, vol. 16(4), pages 1-19, February.
    16. Thomas Ferguson & Hans-Joachim Voth, 2008. "Betting on Hitler—The Value of Political Connections in Nazi Germany," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 123(1), pages 101-137.
    17. Nikolova, Milena, 2018. "Self-Employment Can Be Good for Your Health," GLO Discussion Paper Series 226, Global Labor Organization (GLO).
    18. Simplice A. Asongu & Nicholas M. Odhiambo, 2020. "The role of governance in quality education in sub-Saharan Africa," Working Papers of the African Governance and Development Institute 20/077, African Governance and Development Institute.
    19. Christopher H. Rhoads, 2011. "The Implications of “Contamination” for Experimental Design in Education," Journal of Educational and Behavioral Statistics, vol. 36(1), pages 76-104, February.
    20. Peter Z. Schochet, 2021. "Long-Run Labor Market Effects of the Job Corps Program: Evidence from a Nationally Representative Experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(1), pages 128-157, January.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.