
Statistical Power for Random Assignment Evaluations of Education Programs

Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


Cited by:

  1. Susanne James-Burdumy & David Myers & John Deke & Wendy Mansfield & Russell Gersten & Joseph Dimino & Janice Dole & Lauren Liang & Sharon Vaughn & Meaghan Edmonds, 2006. "The National Evaluation of Reading Comprehension Interventions: Design Report," Mathematica Policy Research Reports e9f80b7dfbf6454fa70f3e59f, Mathematica Policy Research.
  2. Kenneth Fortson & Natalya Verbitsky-Savitz & Emma Kopa & Philip Gleason, 2012. "Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates," Mathematica Policy Research Reports 27f871b5b7b94f3a80278a593, Mathematica Policy Research.
  3. Peter Z. Schochet, "undated". "Statistical Theory for the RCT-YES Software: Design-Based Causal Inference for RCTs," Mathematica Policy Research Reports a0c005c003c242308a92c02dc, Mathematica Policy Research.
  4. repec:mpr:mprres:7443 is not listed on IDEAS
  5. Deborah Peikes & Stacy Dale & Eric Lundquist & Janice Genevro & David Meyers, 2011. "Building the Evidence Base for the Medical Home: What Sample and Sample Size Do Studies Need?," Mathematica Policy Research Reports 5814eb8219b24982af7f7536c, Mathematica Policy Research.
  6. John Deke & Hanley Chiang, 2017. "The WWC Attrition Standard," Evaluation Review, vol. 41(2), pages 130-154, April.
  7. Peter Z. Schochet, "undated". "The Late Pretest Problem in Randomized Control Trials of Education Interventions," Mathematica Policy Research Reports fb514df5dbb84a5dbea79865c, Mathematica Policy Research.
  8. Andrew P. Jaciw & Li Lin & Boya Ma, 2016. "An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials," Evaluation Review, vol. 40(5), pages 410-443, October.
  9. World Bank, 2017. "Pre-Primary Education in Mongolia," World Bank Publications - Reports 26402, The World Bank Group.
  10. Geoffrey Phelps & Benjamin Kelcey & Nathan Jones & Shuangshuang Liu, 2016. "Informing Estimates of Program Effects for Studies of Mathematics Professional Development Using Teacher Content Knowledge Outcomes," Evaluation Review, vol. 40(5), pages 383-409, October.
  11. repec:mpr:mprres:4962 is not listed on IDEAS
  12. Peter Z. Schochet, 2013. "Estimators for Clustered Education RCTs Using the Neyman Model for Causal Inference," Journal of Educational and Behavioral Statistics, vol. 38(3), pages 219-238, June.
  13. Peter Z. Schochet & Hanley Chiang, "undated". "Technical Methods Report: Estimation and Identification of the Complier Average Causal Effect Parameter in Education RCTs," Mathematica Policy Research Reports 947d1823e3ff42208532a763d, Mathematica Policy Research.
  14. Peter Z. Schochet, 2020. "Analyzing Grouped Administrative Data for RCTs Using Design-Based Methods," Journal of Educational and Behavioral Statistics, vol. 45(1), pages 32-57, February.
  15. Randall Juras, 2016. "Estimates of Intraclass Correlation Coefficients and Other Design Parameters for Studies of School-Based Nutritional Interventions," Evaluation Review, vol. 40(4), pages 314-333, August.
  16. Tim Kautz & Kathleen Feeney & Hanley Chiang & Sarah Lauffer & Maria Bartlett & Charles Tilley, "undated". "Using a Survey of Social and Emotional Learning and School Climate to Inform Decisionmaking," Mathematica Policy Research Reports 34e434508fe24859b54434e73, Mathematica Policy Research.
  17. Peter Z. Schochet, "undated". "Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations," Mathematica Policy Research Reports 61fb6c057561451a8a6074508, Mathematica Policy Research.
  18. Jessaca Spybrook & Benjamin Kelcey, 2016. "Introduction to Three Special Issues on Design Parameter Values for Planning Cluster Randomized Trials in the Social Sciences," Evaluation Review, vol. 40(6), pages 491-499, December.
  19. Yang Tang & Thomas D. Cook, 2018. "Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study," Evaluation Review, vol. 42(1), pages 71-110, February.
  20. repec:mpr:mprres:6372 is not listed on IDEAS
  21. repec:mpr:mprres:6094 is not listed on IDEAS
  22. Giamattei, Marcus & Graf Lambsdorff, Johann, 2015. "classEx: An online software for classroom experiments," Passauer Diskussionspapiere, Volkswirtschaftliche Reihe V-68-15, University of Passau, Faculty of Business and Economics.
  23. Peter Z. Schochet & Hanley S. Chiang, 2013. "What Are Error Rates for Classifying Teacher and School Performance Using Value-Added Models?," Journal of Educational and Behavioral Statistics, vol. 38(2), pages 142-171, April.
  24. Alexandra Resch & Jillian Berk & Lauren Akers, "undated". "Recognizing and Conducting Opportunistic Experiments in Education: A Guide for Policymakers and Researchers," Mathematica Policy Research Reports b58a999ab27a4cabafac5aa08, Mathematica Policy Research.
  25. John Deke, 2016. "Design and Analysis Considerations for Cluster Randomized Controlled Trials That Have a Small Number of Clusters," Evaluation Review, vol. 40(5), pages 444-486, October.
  26. Christopher H. Rhoads, 2011. "The Implications of “Contamination” for Experimental Design in Education," Journal of Educational and Behavioral Statistics, vol. 36(1), pages 76-104, February.
  27. repec:mpr:mprres:6286 is not listed on IDEAS
  28. Loyalka, Prashant & Song, Yingquan & Wei, Jianguo & Zhong, Weiping & Rozelle, Scott, 2013. "Information, college decisions and financial aid: Evidence from a cluster-randomized controlled trial in China," Economics of Education Review, Elsevier, vol. 36(C), pages 26-40.
  29. Peter Z. Schochet & Hanley S. Chiang, "undated". "Error Rates in Measuring Teacher and School Performance Based on Student Test Score Gains," Mathematica Policy Research Reports d415285f980b4d64b7e75f40b, Mathematica Policy Research.
  30. Elizabeth Tipton & Robert B. Olsen, "undated". "Enhancing the Generalizability of Impact Studies in Education," Mathematica Policy Research Reports 35d5625333dc480aba9765b3b, Mathematica Policy Research.
  31. repec:mpr:mprres:8126 is not listed on IDEAS
  32. Peter Z. Schochet, 2021. "Long‐Run Labor Market Effects of the Job Corps Program: Evidence from a Nationally Representative Experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(1), pages 128-157, January.
  33. repec:mpr:mprres:6409 is not listed on IDEAS
  34. Peter Z. Schochet, 2013. "Student Mobility, Dosage, and Principal Stratification in School-Based RCTs," Journal of Educational and Behavioral Statistics, vol. 38(4), pages 323-354, August.
  35. Jaime Thomas & Sarah A. Avellar & John Deke & Philip Gleason, 2017. "Matched Comparison Group Design Standards in Systematic Reviews of Early Childhood Interventions," Evaluation Review, vol. 41(3), pages 240-279, June.
  36. Melguizo, Tatiana & Sanchez, Fabio & Velasco, Tatiana, 2016. "Credit for Low-Income Students and Access to and Academic Performance in Higher Education in Colombia: A Regression Discontinuity Approach," World Development, Elsevier, vol. 80(C), pages 61-77.
  37. Peter Z. Schochet, 2009. "Do Typical RCTs of Education Interventions Have Sufficient Statistical Power for Linking Impacts on Teacher Practice and Student Achievement Outcomes?," Mathematica Policy Research Reports 8bb2ecd6a142422db269c1e0b, Mathematica Policy Research.
  38. Peter Z. Schochet, 2021. "Statistical Power for Estimating Treatment Effects Using Difference-in-Differences and Comparative Interrupted Time Series Designs with Variation in Treatment Timing," Papers 2102.06770, arXiv.org, revised Oct 2021.
  39. Peter M. Steiner & Vivian C. Wong, 2018. "Assessing Correspondence Between Experimental and Nonexperimental Estimates in Within-Study Comparisons," Evaluation Review, , vol. 42(2), pages 214-247, April.
  40. Robin Jacob & Marie-Andree Somers & Pei Zhu & Howard Bloom, 2016. "The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions," Evaluation Review, vol. 40(3), pages 167-198, June.
  41. de Hoyos, Rafael & Garcia-Moreno, Vicente A. & Patrinos, Harry Anthony, 2017. "The impact of an accountability intervention with diagnostic feedback: Evidence from Mexico," Economics of Education Review, Elsevier, vol. 58(C), pages 123-140.
  42. Abe, Yasuyo & Gee, Kevin A., 2014. "Sensitivity analyses for clustered data: An illustration from a large-scale clustered randomized controlled trial in education," Evaluation and Program Planning, Elsevier, vol. 47(C), pages 26-34.
  43. repec:mpr:mprres:7219 is not listed on IDEAS
  44. E. C. Hedberg & Larry V. Hedges, 2014. "Reference Values of Within-District Intraclass Correlations of Academic Achievement by District Characteristics," Evaluation Review, vol. 38(6), pages 546-582, December.
  45. Christopher Rhoads, 2017. "Coherent Power Analysis in Multilevel Studies Using Parameters From Surveys," Journal of Educational and Behavioral Statistics, vol. 42(2), pages 166-194, April.
  46. repec:mpr:mprres:6721 is not listed on IDEAS
  47. Rebecca A. Maynard, 2006. "Presidential address: Evidence-based decision making: What will it take for the decision makers to care?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(2), pages 249-265.
  48. Peter Z. Schochet, 2010. "The Late Pretest Problem in Randomized Control Trials of Education Interventions," Journal of Educational and Behavioral Statistics, vol. 35(4), pages 379-406, August.