
Revisiting the replication crisis without false positives

Author

Listed:
  • Bak-Coleman, Joseph B (University of Washington)
  • Mann, Richard P.
  • Bergstrom, Carl T.
  • Gross, Kevin
  • West, Jevin

Abstract

Efforts to replicate portions of the scientific literature have led to widely varying and often low rates of replicability. This has raised concerns over a "replication crisis" whereby many of the statistically significant claims in the published literature are thought to be false positives, due to some combination of publication bias and widespread use of questionable research practices. However, formal meta-scientific models invoking false positives lead to conclusions that often conflict with observational findings and require additional assumptions to reconcile varying rates of replicability across areas of research. Here, we present a minimal, alternative model of how replication failures can occur even in the absence of false positives. Using our model, we show that variation in estimates of replicability across the social sciences is well explained as an artifact of replication sample size. We additionally demonstrate that key features of reformed science and multi-site replications can be explained without false positives. Our results are consistent with evidence suggesting that file-drawer sizes are likely much smaller, and questionable research practices less abundant, than false-positive models require. We anticipate our findings will be a starting point for a more formal and nuanced discussion of the health of the scientific literature and areas for improvement.
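To make the sample-size argument concrete, the sketch below (not the authors' code) computes the approximate power of a two-sample t-test for a single genuinely true effect at several replication sample sizes. The effect size d = 0.25, the per-group sample sizes, and the alpha = 0.05 threshold are illustrative assumptions, not values taken from the paper. Even with no false positives at all, the expected replication rate under these assumptions ranges from roughly 15% to over 90% depending only on how large the replication study is.

```python
# Hedged illustration, not the authors' model: expected replication rate for a
# *true* effect, as a function of replication sample size alone.
# Assumed values: d = 0.25, per-group n in (25, 50, 100, 400), alpha = 0.05.
import numpy as np
from scipy import stats

def power_two_sample(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sided, two-sample t-test."""
    se = np.sqrt(2.0 / n_per_group)          # SE of the standardized mean difference
    z_crit = stats.norm.ppf(1 - alpha / 2)
    z_effect = d / se
    return (1 - stats.norm.cdf(z_crit - z_effect)) + stats.norm.cdf(-z_crit - z_effect)

TRUE_EFFECT = 0.25   # assumed standardized effect (Cohen's d); every effect here is real
for n in (25, 50, 100, 400):                 # assumed per-group replication sample sizes
    rate = power_two_sample(TRUE_EFFECT, n)
    print(f"replication n per group = {n:4d}  ->  expected replication rate ~ {rate:.2f}")
```

Under these assumptions, a "failed" replication says more about the replication's sample size than about whether the original claim was a false positive, which is the intuition the abstract describes.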

Suggested Citation

  • Bak-Coleman, Joseph B & Mann, Richard P. & Bergstrom, Carl T. & Gross, Kevin & West, Jevin, 2022. "Revisiting the replication crisis without false positives," SocArXiv rkyf7_v1, Center for Open Science.
  • Handle: RePEc:osf:socarx:rkyf7_v1
    DOI: 10.31219/osf.io/rkyf7_v1

    Download full text from publisher

    File URL: https://osf.io/download/62696b330890d80ba47b13b5/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/rkyf7_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    2. Gall, Thomas & Maniadis, Zacharias, 2019. "Evaluating solutions to the problem of false positives," Research Policy, Elsevier, vol. 48(2), pages 506-515.
    3. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39_v1, Center for Open Science.
    4. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," GLO Discussion Paper Series 1147, Global Labor Organization (GLO).
    5. Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    6. Daniël Lakens & Eline N. F. Ensinck, 2024. "Make abandoned research publicly available," Nature Human Behaviour, Nature, vol. 8(4), pages 609-610, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rose, Julian & Neubauer, Florian & Ankel-Peters, Jörg, 2024. "Long-Term Effects of the Targeting the Ultra-Poor Program - A Reproducibility and Replicability Assessment of Banerjee et al. (2021)," I4R Discussion Paper Series 142, The Institute for Replication (I4R).
    2. Danielle V. Handel & Eric A. Hanushek, 2024. "Contexts of Convenience: Generalizing from Published Evaluations of School Finance Policies," Evaluation Review, vol. 48(3), pages 461-494, June.
    3. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    4. Zuzana Irsova & Hristos Doucouliagos & Tomas Havranek & T. D. Stanley, 2023. "Meta-Analysis of Social Science Research: A Practitioner's Guide," Working Papers IES 2023/25, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2023.
    5. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    6. Salandra, Rossella & Criscuolo, Paola & Salter, Ammon, 2021. "Directing scientists away from potentially biased publications: the role of systematic reviews in health care," Research Policy, Elsevier, vol. 50(1).
    7. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    8. Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues & Ben Styles & Ben Weidmann, 2023. "Experimental education research: rethinking why, how and when to use random assignment," CEPEO Working Paper Series 23-07, UCL Centre for Education Policy and Equalising Opportunities, revised Aug 2023.
    9. Libman, A., 2024. ""Zoo" of empirical results: Quantitative research and accumulation of knowledge in social sciences," Journal of the New Economic Association, New Economic Association, vol. 65(4), pages 178-194.
    10. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    11. Costanza Naguib, 2024. "P-hacking and Significance Stars," Diskussionsschriften dp2409, Universitaet Bern, Departement Volkswirtschaft.
    12. Irsova, Zuzana & Bom, Pedro Ricardo Duarte & Havranek, Tomas & Rachinger, Heiko, 2023. "Spurious Precision in Meta-Analysis," MetaArXiv 3qp2w, Center for Open Science.
    13. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," I4R Discussion Paper Series 38, The Institute for Replication (I4R).
    14. Herresthal, Claudia, 2022. "Hidden testing and selective disclosure of evidence," Journal of Economic Theory, Elsevier, vol. 200(C).
    15. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    16. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    17. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    18. Jordan C. Stanley & Evan S. Totty, 2024. "Synthetic Data and Social Science Research: Accuracy Assessments and Practical Considerations from the SIPP Synthetic Beta," NBER Chapters, in: Data Privacy Protection and the Conduct of Applied Research: Methods, Approaches and their Consequences, National Bureau of Economic Research, Inc.
    19. Emilija Stojmenova Duh & Andrej Duh & Uroš Droftina & Tim Kos & Urban Duh & Tanja Simonič Korošak & Dean Korošak, 2019. "Publish-and-Flourish: Using Blockchain Platform to Enable Cooperative Scholarly Communication," Publications, MDPI, vol. 7(2), pages 1-15, May.
    20. Gruener, Sven & Mußhoff, Oliver, 2022. "Do anticipated consequences of climate change affect risk preferences and cooperation? – An experimental study with farmers, students, and representatives of the general population of Germany," OSF Preprints jq57n_v1, Center for Open Science.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:socarx:rkyf7_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://arabixiv.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.