Printed from https://ideas.repec.org/p/ucl/cepeow/20-16.html

Quantifying 'promising trials bias' in randomized controlled trials in education

Author

Listed:
  • Sam Sims

    (Centre for Education Policy and Equalising Opportunities, UCL Institute of Education, University College London)

  • Jake Anders

    (Centre for Education Policy and Equalising Opportunities, UCL Institute of Education, University College London)

  • Matthew Inglis

    (Centre for Mathematical Cognition, Loughborough University)

  • Hugues Lortie-Forgues

    (Centre for Mathematical Cognition, Loughborough University)

Abstract

Randomized controlled trials have proliferated in education, in part because they provide an unbiased estimator of the causal impact of interventions. It is increasingly recognized that many such trials in education have low power to detect an effect, if indeed one exists. It is less well known, however, that low-powered trials tend to systematically exaggerate effect sizes among the subset of interventions that show promising results. We conduct a retrospective design analysis to quantify this bias across 23 promising trials, finding that the estimated effect sizes are exaggerated by an average of 52% or more. Promising trials bias can be reduced ex ante by increasing the power of the trials that are commissioned, and guarded against ex post by including estimates of the exaggeration ratio when reporting trial findings. Our results also suggest that challenges around implementation fidelity are not the only reason that apparently successful interventions often fail to subsequently scale up: the findings from the initial promising trial may simply have been exaggerated.

Length: 19 pages
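The exaggeration effect the abstract describes can be illustrated with a short simulation: among low-powered trials, the subset whose estimates reach statistical significance overstates the true effect on average (the "exaggeration ratio" of a retrospective design analysis). This is a minimal sketch of the general phenomenon, not the authors' method; the true effect size and standard error below are hypothetical, not taken from the paper's 23 trials.

```python
import random

def exaggeration_ratio(true_effect, se, n_sims=200_000, z_crit=1.96, seed=0):
    """Average |estimate| / true_effect among trials that look 'promising'
    (i.e. whose estimate is statistically significant at the 5% level)."""
    rng = random.Random(seed)
    significant = []
    for _ in range(n_sims):
        # One simulated trial: the estimate is drawn from the sampling
        # distribution around the true effect.
        est = rng.gauss(true_effect, se)
        if abs(est) / se > z_crit:  # the trial appears 'promising'
            significant.append(abs(est))
    return sum(significant) / len(significant) / true_effect

# A small true effect (0.1 SD) measured with a large standard error (0.08 SD)
# yields a badly under-powered trial; significant estimates are inflated.
ratio = exaggeration_ratio(true_effect=0.1, se=0.08)
print(f"average exaggeration among significant trials: {ratio:.2f}x")
```

With these hypothetical numbers the significant estimates overstate the true effect by roughly a factor of two, even though each individual estimate is unbiased, which is the mechanism behind promising trials bias.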

Suggested Citation

  • Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues, 2020. "Quantifying 'promising trials bias' in randomized controlled trials in education," CEPEO Working Paper Series 20-16, UCL Centre for Education Policy and Equalising Opportunities, revised Nov 2020.
  • Handle: RePEc:ucl:cepeow:20-16

    Download full text from publisher

    File URL: https://repec-cepeo.ucl.ac.uk/cepeow/cepeowp20-16.pdf
    File Function: First version, 2020
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Marco Caliendo & Stefan Tübbicke, 2020. "New evidence on long-term effects of start-up subsidies: matching estimates and their robustness," Empirical Economics, Springer, vol. 59(4), pages 1605-1631, October.
    2. Christoph Dworschak, 2024. "Bias mitigation in empirical peace and conflict studies: A short primer on posttreatment variables," Journal of Peace Research, Peace Research Institute Oslo, vol. 61(3), pages 462-476, May.
    3. Jasjeet Singh Sekhon & Richard D. Grieve, 2012. "A matching method for improving covariate balance in cost‐effectiveness analyses," Health Economics, John Wiley & Sons, Ltd., vol. 21(6), pages 695-714, June.
    4. Shen, Chung-Hua & Wu, Meng-Wen & Chen, Ting-Hsuan & Fang, Hao, 2016. "To engage or not to engage in corporate social responsibility: Empirical evidence from global banking sector," Economic Modelling, Elsevier, vol. 55(C), pages 207-225.
    5. Berta, Paolo & Callea, Giuditta & Martini, Gianmaria & Vittadini, Giorgio, 2010. "The effects of upcoding, cream skimming and readmissions on the Italian hospitals efficiency: A population-based investigation," Economic Modelling, Elsevier, vol. 27(4), pages 812-821, July.
    6. Lisa van der Sande & Ilona Wildeman & Adriana G. Bus & Roel van Steensel, 2023. "Nudging to Stimulate Reading in Primary and Secondary Education," SAGE Open, , vol. 13(2), pages 21582440231, April.
    7. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    8. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    9. Jan Stede, 2019. "Do Energy Efficiency Networks Save Energy? Evidence from German Plant-Level Data," Discussion Papers of DIW Berlin 1813, DIW Berlin, German Institute for Economic Research.
    10. Elizabeth A. Stuart & Anna Rhodes, 2017. "Generalizing Treatment Effect Estimates From Sample to Population: A Case Study in the Difficulties of Finding Sufficient Data," Evaluation Review, , vol. 41(4), pages 357-388, August.
    11. Simone Bertoli & Francesca Marchetta, 2014. "Migration, Remittances and Poverty in Ecuador," Journal of Development Studies, Taylor & Francis Journals, vol. 50(8), pages 1067-1089, August.
    12. Oliver Tiemann & Jonas Schreyögg, 2012. "Changes in hospital efficiency after privatization," Health Care Management Science, Springer, vol. 15(4), pages 310-326, December.
    13. Patrick Christian Feihle & Jochen Lawrenz, 2017. "The Issuance of German SME Bonds and its Impact on Operating Performance," Schmalenbach Business Review, Springer;Schmalenbach-Gesellschaft, vol. 18(3), pages 227-259, August.
    14. Ahmadiani, Mona & Ferreira, Susana, 2021. "Well-being effects of extreme weather events in the United States," Resource and Energy Economics, Elsevier, vol. 64(C).
    15. Silvio Daidone & Sudhanshu Handa & Benjamin Davis & Mike Park & Robert D. Osei & Isaac Osei-Akoto, 2015. "Social Networks and Risk Management in Ghana’s Livelihood Empowerment Against Poverty Programme," Papers inwopa781, Innocenti Working Papers.
    16. Carolyn Riehl & Melissa A. Lyon, 2017. "Counting on Context: Cross-Sector Collaborations for Education and the Legacy of James Coleman’s Sociological Vision," The ANNALS of the American Academy of Political and Social Science, , vol. 674(1), pages 262-280, November.
    17. Tenglong Li & Kenneth A. Frank & Mingming Chen, 2024. "A Conceptual Framework for Quantifying the Robustness of a Regression-Based Causal Inference in Observational Study," Mathematics, MDPI, vol. 12(3), pages 1-14, January.
    18. Sharique Hasan & Surendrakumar Bagde, 2015. "Peers and Network Growth: Evidence from a Natural Experiment," Management Science, INFORMS, vol. 61(10), pages 2536-2547, October.
    19. Bunte, Jonas B. & Desai, Harsh & Gbala, Kanio & Parks, Bradley & Runfola, Daniel Miller, 2018. "Natural resource sector FDI, government policy, and economic growth: Quasi-experimental evidence from Liberia," World Development, Elsevier, vol. 107(C), pages 151-162.
    20. Behncke S, 2009. "How Does Retirement Affect Health?," Health, Econometrics and Data Group (HEDG) Working Papers 09/11, HEDG, c/o Department of Economics, University of York.

    More about this item

    Keywords

    randomized controlled trials; education; promising trials bias;

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucl:cepeow:20-16. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jake Anders (email available below). General contact details of provider: https://edirc.repec.org/data/epucluk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.