
How often does random assignment fail? Estimates and recommendations

Author

Listed:
  • Goldberg, Matthew H.

Abstract

A fundamental goal of the scientific process is to make causal inferences. Random assignment to experimental conditions has been taken to be a gold-standard technique for establishing causality. Despite this, it is unclear how often random assignment fails to eliminate non-trivial differences between experimental conditions, and it is unknown to what extent larger sample sizes mitigate this issue. Chance differences between conditions may be especially important when investigating topics on which attitudes are highly sample-dependent, such as climate change and other politicized issues. Three studies examine simulated data (Study 1), three real datasets from original environmental psychology experiments (Study 2), and one nationally representative dataset (Study 3), and find that differences between conditions that remain after random assignment are surprisingly common at sample sizes typical of social psychological experiments. Methods and practices for identifying and mitigating such differences are discussed, with implications especially relevant to experiments in social and environmental psychology.
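The abstract's core question, how often simple random assignment leaves a meaningful covariate imbalance between two conditions and how this varies with sample size, can be illustrated with a minimal Monte Carlo sketch. This is not the preprint's code; the normally distributed covariate, the |d| >= 0.2 imbalance threshold, and the sample sizes below are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_failure_rate(n_per_group, n_sims=10_000, threshold=0.2):
        """Share of simulated experiments in which random assignment leaves a
        covariate difference of at least `threshold` SDs (Cohen's d) between
        two conditions. Threshold and covariate distribution are assumptions."""
        failures = 0
        for _ in range(n_sims):
            # One baseline covariate per participant (e.g., a pretest attitude score)
            covariate = rng.standard_normal(2 * n_per_group)
            # Randomly assign exactly half of the participants to each condition
            treated = rng.permutation(2 * n_per_group) < n_per_group
            d = (covariate[treated].mean() - covariate[~treated].mean()) / covariate.std(ddof=1)
            failures += abs(d) >= threshold
        return failures / n_sims

    for n in (25, 50, 100, 200):
        print(f"n per condition = {n:>3}: P(|d| >= 0.2) ~= {simulate_failure_rate(n):.2f}")

Under these assumptions the imbalance rate shrinks as the per-condition sample size grows but remains non-negligible at sizes common in social psychology, which is the pattern the abstract describes the paper's Studies 1-3 examining with simulated and real data.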

Suggested Citation

  • Goldberg, Matthew H., 2019. "How often does random assignment fail? Estimates and recommendations," OSF Preprints s2j4r, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:s2j4r
    DOI: 10.31219/osf.io/s2j4r

    Download full text from publisher

    File URL: https://osf.io/download/5ca7670f2aa0d90016d1972c/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/s2j4r?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Charness, Gary & Gneezy, Uri & Kuhn, Michael A., 2012. "Experimental methods: Between-subject and within-subject design," Journal of Economic Behavior & Organization, Elsevier, vol. 81(1), pages 1-8.
    3. Shadish, William R. & Clark, M. H. & Steiner, Peter M., 2008. "Can Nonrandomized Experiments Yield Accurate Answers? A Randomized Experiment Comparing Random and Nonrandom Assignments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1334-1344.
    4. Rubin, Donald B., 2008. "Comment: The Design and Analysis of Gold Standard Randomized Experiments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1350-1353.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    6. James N. Druckman & Thomas J. Leeper, 2012. "Learning More from Political Communication Experiments: Pretreatment and Its Effects," American Journal of Political Science, John Wiley & Sons, vol. 56(4), pages 875-896, October.
    7. Donald B. Rubin, 2005. "Causal Inference Using Potential Outcomes: Design, Modeling, Decisions," Journal of the American Statistical Association, American Statistical Association, vol. 100, pages 322-331, March.
    8. Sander van der Linden & Anthony Leiserowitz & Edward Maibach, 2018. "Scientific agreement can neutralize politicization of facts," Nature Human Behaviour, Nature, vol. 2(1), pages 2-3, January.
    9. Jacob M. Montgomery & Brendan Nyhan & Michelle Torres, 2018. "How Conditioning on Posttreatment Variables Can Ruin Your Experiment and What to Do about It," American Journal of Political Science, John Wiley & Sons, vol. 62(3), pages 760-775, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Kari Lock Morgan & Donald B. Rubin, 2015. "Rerandomization to Balance Tiers of Covariates," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1412-1421, December.
    3. Leandro de Magalhaes & Salomo Hirvonen, 2019. "The Incumbent-Challenger Advantage and the Winner-Runner-up Advantage," Bristol Economics Discussion Papers 19/710, School of Economics, University of Bristol, UK.
    4. Yasemin Kisbu-Sakarya & Thomas D. Cook & Yang Tang & M. H. Clark, 2018. "Comparative Regression Discontinuity: A Stress Test With Small Samples," Evaluation Review, vol. 42(1), pages 111-143, February.
    5. Jordan H. Rickles, 2011. "Using Interviews to Understand the Assignment Mechanism in a Nonexperimental Study," Evaluation Review, vol. 35(5), pages 490-522, October.
    6. Jared Coopersmith & Thomas D. Cook & Jelena Zurovac & Duncan Chaplin & Lauren V. Forrow, 2022. "Internal And External Validity Of The Comparative Interrupted Time‐Series Design: A Meta‐Analysis," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(1), pages 252-277, January.
    7. Pietrzyk, Irena & Erdmann, Melinda, 2020. "Investigating the impact of interventions on educational disparities: Estimating average treatment effects (ATEs) is not sufficient," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 65, pages 1-1.
    8. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    9. Jordan H. Rickles & Michael Seltzer, 2014. "A Two-Stage Propensity Score Matching Strategy for Treatment Effect Estimation in a Multisite Observational Study," Journal of Educational and Behavioral Statistics, vol. 39(6), pages 612-636, December.
    10. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    11. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    12. Jinglong Zhao, 2024. "Experimental Design For Causal Inference Through An Optimization Lens," Papers 2408.09607, arXiv.org, revised Aug 2024.
    13. Cousineau, Martin & Verter, Vedat & Murphy, Susan A. & Pineau, Joelle, 2023. "Estimating causal effects with optimization-based methods: A review and empirical comparison," European Journal of Operational Research, Elsevier, vol. 304(2), pages 367-380.
    14. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    15. Ali Tafti & Galit Shmueli, 2020. "Beyond Overall Treatment Effects: Leveraging Covariates in Randomized Experiments Guided by Causal Structure," Information Systems Research, INFORMS, vol. 31(4), pages 1183-1199, December.
    16. Georgia S. Papoutsi & Stathis Klonaris & Andreas C. Drichoutis, 2018. "The health-taste trade-off in consumer decision making: An experimental approach," Working Papers 2018-2, Agricultural University of Athens, Department Of Agricultural Economics.
    17. Fels, Katja M., 2021. "Who nudges whom? Field experiments with public partners," Ruhr Economic Papers 906, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    18. Christopher Bockel-Rickermann & Sam Verboven & Tim Verdonck & Wouter Verbeke, 2023. "A Causal Perspective on Loan Pricing: Investigating the Impacts of Selection Bias on Identifying Bid-Response Functions," Papers 2309.03730, arXiv.org.
    19. Martin Cousineau & Vedat Verter & Susan A. Murphy & Joelle Pineau, 2022. "Estimating causal effects with optimization-based methods: A review and empirical comparison," Papers 2203.00097, arXiv.org.
    20. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:s2j4r. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.