
Experimental education research: rethinking why, how and when to use random assignment

Author

Listed:
  • Sam Sims

    (UCL Centre for Education Policy and Equalising Opportunities, University College London)

  • Jake Anders

    (UCL Centre for Education Policy and Equalising Opportunities, University College London)

  • Matthew Inglis

    (Centre for Mathematical Cognition, Loughborough University)

  • Hugues Lortie-Forgues

    (Centre for Mathematical Cognition, Loughborough University)

  • Ben Styles

    (NFER)

  • Ben Weidmann

    (Skills Lab, Harvard University)

Abstract

Over the last twenty years, education researchers have increasingly conducted randomised experiments with the goal of informing the decisions of educators and policymakers. Such experiments have generally employed broad, consequential, standardised outcome measures in the hope of allowing decision makers to compare the effectiveness of different approaches. However, a combination of small effect sizes, wide confidence intervals, and treatment effect heterogeneity means that researchers have largely failed to achieve this goal. We argue that quasi-experimental methods and multi-site trials will often be superior for informing educators' decisions, on the grounds that they can achieve greater precision and better address heterogeneity. Experimental research remains valuable in applied education research, but it should primarily be used to test theoretical models, which can in turn inform educators' mental models, rather than to inform decision making directly. Since comparable effect size estimates are not of interest when testing educational theory, researchers can and should improve the power of theory-informing experiments by using more closely aligned (i.e., more valid) outcome measures. This approach would reduce wasteful research spending and make the research that does go ahead more statistically informative, thus improving the return on investment in educational research.
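The power argument in the closing sentences of the abstract can be made concrete with a simple sample-size calculation. The sketch below is illustrative and not drawn from the paper: the effect sizes (an assumed d = 0.05 on a broad standardised outcome versus an assumed d = 0.30 on a closely aligned measure) are hypothetical, and the calculation uses the TTestIndPower class from the statsmodels Python library.

# Illustrative power calculation; the effect sizes are assumptions,
# not estimates reported in the paper.
from statsmodels.stats.power import TTestIndPower

power_calc = TTestIndPower()

scenarios = {
    "broad standardised outcome (assumed d = 0.05)": 0.05,
    "closely aligned outcome (assumed d = 0.30)": 0.30,
}

for label, d in scenarios.items():
    # Pupils needed per arm of a two-arm trial for 80% power in a
    # two-sided t-test at alpha = 0.05.
    n_per_arm = power_calc.solve_power(effect_size=d, alpha=0.05,
                                       power=0.80, alternative="two-sided")
    print(f"{label}: about {n_per_arm:,.0f} pupils per arm")

Under these assumed effect sizes, the broad measure requires roughly 6,300 pupils per arm against roughly 180 for the aligned measure, a 36-fold difference (the ratio of the squared effect sizes); school-level (clustered) randomisation, common in education trials, would inflate both figures further.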

Suggested Citation

  • Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues & Ben Styles & Ben Weidmann, 2023. "Experimental education research: rethinking why, how and when to use random assignment," CEPEO Working Paper Series 23-07, UCL Centre for Education Policy and Equalising Opportunities, revised Aug 2023.
  • Handle: RePEc:ucl:cepeow:23-07

    Download full text from publisher

    File URL: https://repec-cepeo.ucl.ac.uk/cepeow/cepeowp23-07r1.pdf
    File Function: Revised version, 2023
    Download Restriction: no

    References listed on IDEAS

    1. Sam Sims & Harry Fletcher-Wood & Alison O'Mara-Eves & Sarah Cottingham & Claire Stansfield & Josh Goodrich & Jo Van Herwegen & Jake Anders, 2022. "Effective teacher professional development: new theory and a meta-analytic test," CEPEO Working Paper Series 22-02, UCL Centre for Education Policy and Equalising Opportunities, revised Jan 2022.
    2. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    3. repec:mpr:mprres:2511 is not listed on IDEAS
    4. Duncan D. Chaplin & Thomas D. Cook & Jelena Zurovac & Jared S. Coopersmith & Mariel M. Finucane & Lauren N. Vollmer & Rebecca E. Morris, 2018. "The Internal And External Validity Of The Regression Discontinuity Design: A Meta‐Analysis Of 15 Within‐Study Comparisons," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 37(2), pages 403-429, March.
    5. Ben Ost & Anuj Gangopadhyaya & Jeffrey C. Schiman, 2017. "Comparing standard deviation effects across contexts," Education Economics, Taylor & Francis Journals, vol. 25(3), pages 251-265, May.
    6. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    7. Luke Keele & Corrine McConnaughy & Ismail White, 2012. "Strengthening the Experimenter’s Toolbox: Statistical Estimation of Internal Validity," American Journal of Political Science, John Wiley & Sons, vol. 56(2), pages 484-499, April.
    8. Alberto Abadie & Susan Athey & Guido W. Imbens & Jeffrey M. Wooldridge, 2020. "Sampling‐Based versus Design‐Based Uncertainty in Regression Analysis," Econometrica, Econometric Society, vol. 88(1), pages 265-296, January.
    9. Ben Weidmann & Luke Miratrix, 2021. "Lurking Inferential Monsters? Quantifying Selection Bias In Evaluations Of School Programs," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(3), pages 964-986, June.
    10. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    11. Yew-Kwang Ng, 2003. "From preference to happiness: Towards a more complete welfare economics," Social Choice and Welfare, Springer;The Society for Social Choice and Welfare, vol. 20(2), pages 307-350, March.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sam Sims & Harry Fletcher-Wood & Thomas Godfrey-Faussett & Peps Mccrea & Stefanie Meliss, 2023. "Modelling evidence-based practice in initial teacher training: causal effects on teachers' skills, knowledge and self-efficacy," CEPEO Working Paper Series 23-09, UCL Centre for Education Policy and Equalising Opportunities, revised Aug 2023.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    2. Christopher J. Ruhm, 2019. "Shackling the Identification Police?," Southern Economic Journal, John Wiley & Sons, vol. 85(4), pages 1016-1026, April.
    3. Martin, Will, 2021. "Tools for measuring the full impacts of agricultural interventions," IFPRI-MCC technical papers 2, International Food Policy Research Institute (IFPRI).
    4. Cristina Bellés-Obrero & María Lombardi, 2022. "Teacher Performance Pay and Student Learning: Evidence from a Nationwide Program in Peru," Economic Development and Cultural Change, University of Chicago Press, vol. 70(4), pages 1631-1669.
    5. Ashesh Rambachan & Jonathan Roth, 2020. "Design-Based Uncertainty for Quasi-Experiments," Papers 2008.00602, arXiv.org, revised Oct 2024.
    6. Derksen, Laura & Kerwin, Jason Theodore & Reynoso, Natalia Ordaz & Sterck, Olivier, 2021. "Appointments: A More Effective Commitment Device for Health Behaviors," SocArXiv y8gh7, Center for Open Science.
    7. Jesus Fernandez-Villaverde, 2020. "Simple Rules for a Complex World with Artificial Intelligence," PIER Working Paper Archive 20-010, Penn Institute for Economic Research, Department of Economics, University of Pennsylvania.
    8. Vellore Arthi & James Fenske, 2018. "Polygamy and child mortality: Historical and modern evidence from Nigeria’s Igbo," Review of Economics of the Household, Springer, vol. 16(1), pages 97-141, March.
    9. Alex Hollingsworth & Mike Huang & Ivan J. Rudik & Nicholas J. Sanders, 2020. "A Thousand Cuts: Cumulative Lead Exposure Reduces Academic Achievement," NBER Working Papers 28250, National Bureau of Economic Research, Inc.
    10. Irina Gemmo & Pierre-Carl Michaud & Olivia S. Mitchell, 2023. "Selection into Financial Education and Effects on Portfolio Choice," NBER Working Papers 31682, National Bureau of Economic Research, Inc.
    11. Özler, Berk & Çelik, Çiğdem & Cunningham, Scott & Cuevas, P. Facundo & Parisotto, Luca, 2021. "Children on the move: Progressive redistribution of humanitarian cash transfers among refugees," Journal of Development Economics, Elsevier, vol. 153(C).
    12. Andreas C Drichoutis & Rodolfo M Nayga, 2020. "Economic Rationality under Cognitive Load," The Economic Journal, Royal Economic Society, vol. 130(632), pages 2382-2409.
    13. Wallin, Annika & Wahlberg, Lena & Persson, Johannes & Dewitt, Barry, 2020. "“Science and proven experience”: How should the epistemology of medicine inform the regulation of healthcare?," Health Policy, Elsevier, vol. 124(8), pages 842-848.
    14. Wang Ning, 2018. "Law and the Economy: An Introduction to Coasian Law and Economics," Man and the Economy, De Gruyter, vol. 5(2), pages 1-13, December.
    15. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    16. Han, Kevin & Basse, Guillaume & Bojinov, Iavor, 2024. "Population interference in panel experiments," Journal of Econometrics, Elsevier, vol. 238(1).
    17. Veisten, Knut, 2007. "Contingent valuation controversies: Philosophic debates about economic theory," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 36(2), pages 204-232, April.
    18. John A. List & Fatemeh Momeni & Yves Zenou, 2020. "The Social Side of Early Human Capital Formation: Using a Field Experiment to Estimate the Causal Impact of Neighborhoods," Working Papers 2020-187, Becker Friedman Institute for Research In Economics.
    19. Conti, Gabriella & Poupakis, Stavros & Ekamper, Peter & Bijwaard, Govert E. & Lumey, L.H., 2024. "Severe prenatal shocks and adolescent health: Evidence from the Dutch Hunger Winter," Economics & Human Biology, Elsevier, vol. 53(C).
    20. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.

    More about this item

    Keywords

    randomized controlled trials; education; research; experiments; policy;

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucl:cepeow:23-07. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jake Anders (email available below). General contact details of provider: https://edirc.repec.org/data/epucluk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.