
The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics

Author

Listed:
  • Sadri, Arash

Abstract

Resolving the "replication crisis" is now a top priority of the scientific community. "Reproducibility" is held up as a central tenet of science, and the estimated economic and social burden of irreproducible research is huge. Numerous proposals have been made, yet there is not only no established solution but not even agreement on whether a "crisis" exists at all. Here, by questioning the philosophical foundations of our study designs and analyses, I trace the "crisis" back to the reductionist ontologies and methodologies ingrained in the modern statistical methods that have dominated the biological, medical, psychological, and social sciences for a century. The crisis is not our inability to "reproduce" results but that we expect to be able to "reproduce" results despite neglecting almost all individual-level and contextual variables of complex processes.
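
The abstract's core claim, that aggregate results should not be expected to replicate when contextual variables are ignored, can be illustrated with a small simulation. The sketch below is purely illustrative and is not taken from the paper: it assumes a hypothetical treatment whose effect depends on an unmeasured contextual moderator, so two "replications" of the same analysis, run at sites with different contexts, produce conflicting average effects even though neither study did anything wrong.

    # Illustrative sketch (not from the paper): a hypothetical treatment whose
    # effect depends on an unmeasured contextual moderator. Two studies that
    # ignore the moderator estimate different "average effects" and appear to
    # fail to replicate.
    import numpy as np

    rng = np.random.default_rng(0)

    def run_study(context_mix, n=500):
        """Simulate one study; context_mix = share of units in context A."""
        in_context_a = rng.random(n) < context_mix
        treated = rng.random(n) < 0.5
        # Assumed true effects: +1.0 in context A, -0.5 in context B.
        effect = np.where(in_context_a, 1.0, -0.5)
        outcome = treated * effect + rng.normal(0, 1, n)
        # Naive analysis: difference in means, moderator never measured.
        return outcome[treated].mean() - outcome[~treated].mean()

    print("Site 1 (80% context A):", round(run_study(0.8), 2))  # roughly +0.7
    print("Site 2 (20% context A):", round(run_study(0.2), 2))  # roughly -0.2

Each estimate is "correct" for its own population; the apparent replication failure arises only because the analysis collapses heterogeneous, context-dependent processes into a single average effect, which is the pattern the paper attributes to reductionist statistics (compare the references on Simpson's paradox and on heterogeneity of study samples below).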

Suggested Citation

  • Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
  • Handle: RePEc:osf:metaar:yxba5
    DOI: 10.31219/osf.io/yxba5

    Download full text from publisher

    File URL: https://osf.io/download/6266a2a82aff521d7cb6a9b6/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/yxba5?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    References listed on IDEAS

    1. Blakeley B. McShane & David Gal, 2017. "Rejoinder: Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 904-908, July.
    2. Nicole C Nelson & Kelsey Ichikawa & Julie Chung & Momin M Malik, 2021. "Mapping the discursive dimensions of the reproducibility crisis: A mixed methods analysis," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-20, July.
    3. Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Chr, 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.
      • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Bjorn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambe, 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
    4. Alessandro Selvitella, 2017. "The ubiquity of the Simpson’s Paradox," Journal of Statistical Distributions and Applications, Springer, vol. 4(1), pages 1-16, December.
    5. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    6. David Colquhoun, 2019. "The False Positive Risk: A Proposal Concerning What to Do About p-Values," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 192-201, March.
    7. Valentin Amrhein & David Trafimow & Sander Greenland, 2019. "Inferential Statistics as Descriptive Statistics: There Is No Replication Crisis if We Don’t Expect Replication," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 262-270, March.
    8. Raymond Hubbard & Brian D. Haig & Rahul A. Parsa, 2019. "The Limited Role of Formal Statistical Inference in Scientific Inference," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 91-98, March.
    9. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    10. Takuji Usui & Malcolm R Macleod & Sarah K McCann & Alistair M Senior & Shinichi Nakagawa, 2021. "Meta-analysis of variation suggests that embracing variability improves both replicability and generalizability in preclinical research," PLOS Biology, Public Library of Science, vol. 19(5), pages 1-20, May.
    11. Rotem Botvinik-Nezer & Felix Holzmeister & Colin F. Camerer & Anna Dreber & Juergen Huber & Magnus Johannesson & Michael Kirchler & Roni Iwanir & Jeanette A. Mumford & R. Alison Adcock & Paolo Avesani, 2020. "Variability in the analysis of a single neuroimaging dataset by many teams," Nature, Nature, vol. 582(7810), pages 84-88, June.
    12. Matthew J Page & Larissa Shamseer & Douglas G Altman & Jennifer Tetzlaff & Margaret Sampson & Andrea C Tricco & Ferrán Catalá-López & Lun Li & Emma K Reid & Rafael Sarkis-Onofre & David Moher, 2016. "Epidemiology and Reporting Characteristics of Systematic Reviews of Biomedical Research: A Cross-Sectional Study," PLOS Medicine, Public Library of Science, vol. 13(5), pages 1-30, May.
    13. Cartwright,Nancy, 1999. "The Dappled World," Cambridge Books, Cambridge University Press, number 9780521643368, November.
    14. Blakeley B. McShane & Jennifer L. Tackett & Ulf Böckenholt & Andrew Gelman, 2019. "Large-Scale Replication Projects in Contemporary Psychological Research," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 99-105, March.
    15. Cartwright,Nancy, 1999. "The Dappled World," Cambridge Books, Cambridge University Press, number 9780521644112, November.
    16. Scott Marek & Brenden Tervo-Clemmens & Finnegan J. Calabro & David F. Montez & Benjamin P. Kay & Alexander S. Hatoum & Meghan Rose Donohue & William Foran & Ryland L. Miller & Timothy J. Hendrickson &, 2022. "Reproducible brain-wide association studies require thousands of individuals," Nature, Nature, vol. 603(7902), pages 654-660, March.
    17. Leonid Hanin, 2021. "Cavalier Use of Inferential Statistics Is a Major Source of False and Irreproducible Scientific Findings," Mathematics, MDPI, vol. 9(6), pages 1-13, March.
    18. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    19. Shareen A Iqbal & Joshua D Wallach & Muin J Khoury & Sheri D Schully & John P A Ioannidis, 2016. "Reproducible Research Practices and Transparency across the Biomedical Literature," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-13, January.
    20. Blakeley B. McShane & David Gal, 2017. "Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 885-895, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Christopher Allen & David M A Mehler, 2019. "Open science challenges, benefits and tips in early career and beyond," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-14, May.
    3. Tom Engsted, 2024. "What Is the False Discovery Rate in Empirical Research?," Econ Journal Watch, Econ Journal Watch, vol. 21(1), pages 92-112, March.
    4. Gunter, Ulrich & Önder, Irem & Smeral, Egon, 2019. "Scientific value of econometric tourism demand studies," Annals of Tourism Research, Elsevier, vol. 78(C), pages 1-1.
    5. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    6. Luigi Pace & Alessandra Salvan, 2020. "Likelihood, Replicability and Robbins' Confidence Sequences," International Statistical Review, International Statistical Institute, vol. 88(3), pages 599-615, December.
    7. Glenn Shafer, 2021. "Testing by betting: A strategy for statistical and scientific communication," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 407-431, April.
    8. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    9. Giandomenica Becchio, 2020. "The Two Blades of Occam's Razor in Economics: Logical and Heuristic," Economic Thought, World Economics Association, vol. 9(1), pages 1-17, July.
    10. Julian Reiss, 2001. "Natural economic quantities and their measurement," Journal of Economic Methodology, Taylor & Francis Journals, vol. 8(2), pages 287-311.
    11. Matthew Rosenblatt & Link Tejavibulya & Rongtao Jiang & Stephanie Noble & Dustin Scheinost, 2024. "Data leakage inflates prediction performance in connectome-based machine learning models," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    12. Elbæk, Christian T. & Lystbæk, Martin Nørhede & Mitkidis, Panagiotis, 2022. "On the psychology of bonuses: The effects of loss aversion and Yerkes-Dodson law on performance in cognitively and mechanically demanding tasks," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 98(C).
    13. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    14. Aumann, Craig A., 2007. "A methodology for developing simulation models of complex systems," Ecological Modelling, Elsevier, vol. 202(3), pages 385-396.
    15. Florian Ellsaesser & Eric W. K. Tsang & Jochen Runde, 2014. "Models of causal inference: Imperfect but applicable is better than perfect but inapplicable," Strategic Management Journal, Wiley Blackwell, vol. 35(10), pages 1541-1551, October.
    16. Stephen Pratten, 2007. "Realism, closed systems and abstraction," Journal of Economic Methodology, Taylor & Francis Journals, vol. 14(4), pages 473-497.
    17. Kinouchi, Renato, 2018. "Philosophical issues related to risks and values," LSE Research Online Documents on Economics 90470, London School of Economics and Political Science, LSE Library.
    18. Bertoldi, Paolo & Mosconi, Rocco, 2020. "Do energy efficiency policies save energy? A new approach based on energy policy indicators (in the EU Member States)," Energy Policy, Elsevier, vol. 139(C).
    19. Maier, Maximilian & VanderWeele, Tyler & Mathur, Maya B, 2021. "Using Selection Models to Assess Sensitivity to Publication Bias: A Tutorial and Call for More Routine Use," MetaArXiv tp45u, Center for Open Science.
    20. Marcel Boumans & Mary Morgan, 2002. "Ceteris paribus conditions: materiality and the application of economic theories," Journal of Economic Methodology, Taylor & Francis Journals, vol. 8(1), pages 11-26.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:yxba5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.