
The multiversal methodology as a remedy of the replication crisis

Author

Listed:
  • Cantone, Giulio Giacomo

Abstract

This manuscript is a comprehensive historical and theoretical examination of the development of ‘multiversal methods’ as a response to the replication crisis. Multiversal methods are statistical procedures designed to assess the uncertainty arising from analyst-driven decisions in inferential models based on statistical regressions. The replication crisis refers to the growing discovery that many studies fail to replicate the findings of previous studies. It has raised concerns about the reliability and credibility of scientific research, particularly in the social sciences and medicine. Section I provides a non-technical overview of the design of causal inference based on statistical regressions; it then outlines and comments on the procedures for computing multiversal statistics. Section II presents the historical and social context in which key epistemological innovations occurred, contributing to the development of the theories behind multiversal methods. The section argues how and why these advancements drew on the epistemology of misinformation (‘bullshit epistemology’) for a sense of urgency about remedies to some enduring issues in scientific production: publication bias and p-hacking. Section III comments on two relevant works within the Open Science paradigm, outlining the limitations and challenges of this framework.
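The core idea of a multiversal (specification-curve) analysis described in the abstract can be sketched as follows. This is a minimal illustrative example, not the author's own procedure: the synthetic dataset, variable names, and the choice to enumerate only control-variable subsets are all assumptions made here for clarity.

```python
# Minimal sketch of a multiverse analysis: fit one focal regression under
# every defensible specification and inspect the spread of the estimate.
# All data below are synthetic; names (x, z1, z2) are hypothetical.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                  # focal predictor
z1 = rng.normal(size=n)                 # optional control 1
z2 = rng.normal(size=n)                 # optional control 2
y = 0.5 * x + 0.3 * z1 + rng.normal(size=n)

controls = {"z1": z1, "z2": z2}
estimates = []
# Each subset of controls is one analyst-defensible specification
# of the same regression model.
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        cols = [np.ones(n), x] + [controls[name] for name in subset]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(beta[1])       # coefficient on the focal predictor

# The spread of the focal estimate across specifications quantifies the
# model-driven (analyst-driven) component of uncertainty.
print(f"{len(estimates)} specifications, "
      f"estimates in [{min(estimates):.3f}, {max(estimates):.3f}]")
```

In a real application the "multiverse" would also span measurement choices, sample-inclusion rules, and estimator choices, and each estimate would be reported alongside its sampling uncertainty rather than the point estimate alone.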

Suggested Citation

  • Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz_v1, Center for Open Science.
  • Handle: RePEc:osf:metaar:kuhmz_v1
    DOI: 10.31219/osf.io/kuhmz_v1

    Download full text from publisher

    File URL: https://osf.io/download/6449c9093848536ac6495ab3/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/kuhmz_v1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Maren Duvendack & Richard Palmer-Jones & W. Robert Reed, 2017. "What Is Meant by "Replication" and Why Does It Encounter Resistance in Economics?," American Economic Review, American Economic Association, vol. 107(5), pages 46-51, May.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Erik W. van Zwet & Eric A. Cator, 2021. "The significance filter, the winner's curse and the need to shrink," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 75(4), pages 437-452, November.
    4. Cristobal Young, 2019. "The Difference Between Causal Analysis and Predictive Models: Response to “Comment on Young and Holsteen (2017)”," Sociological Methods & Research, , vol. 48(2), pages 431-447, May.
    5. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    6. Leamer, Edward E, 1985. "Sensitivity Analyses Would Help," American Economic Review, American Economic Association, vol. 75(3), pages 308-313, June.
    7. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    8. Guido W. Imbens, 2021. "Statistical Significance, p-Values, and the Reporting of Uncertainty," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 157-174, Summer.
    9. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    10. Lars Leszczensky & Tobias Wolbring, 2022. "How to Deal With Reverse Causality Using Panel Data? Recommendations for Researchers Based on a Simulation Study," Sociological Methods & Research, , vol. 51(2), pages 837-865, May.
    11. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    12. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    13. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    14. Adam Slez, 2019. "The Difference Between Instability and Uncertainty: Comment on Young and Holsteen (2017)," Sociological Methods & Research, , vol. 48(2), pages 400-430, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
    2. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    3. Giulio Giacomo Cantone & Venera Tomaselli, 2024. "A Multiversal Model of Vibration of Effects of the Equitable and Sustainable Well-Being (BES) on Fertility," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 175(3), pages 941-964, December.
    4. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    5. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    6. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. Christophe Pérignon & Olivier Akmansoy & Christophe Hurlin & Anna Dreber & Felix Holzmeister & Juergen Huber & Magnus Johanneson & Michael Kirchler & Albert Menkveld & Michael Razen & Utz Weitzel, 2022. "Reproducibility of Empirical Results: Evidence from 1,000 Tests in Finance," Working Papers hal-03810013, HAL.
    8. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    9. Thibaut Arpinon & Marianne Lefebvre, 2024. "Registered Reports and Associated Benefits for Agricultural Economics," Post-Print hal-04635986, HAL.
    10. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    11. Cantone, Giulio Giacomo & Tomaselli, Venera, 2024. "On the Coherence of Composite Indexes: Multiversal Model and Specification Analysis for an Index of Well-Being," MetaArXiv d5y26, Center for Open Science.
    12. Cantone, Giulio Giacomo & Tomaselli, Venera, 2024. "On the Coherence of Composite Indexes: Multiversal Model and Specification Analysis for an Index of Well-Being," MetaArXiv d5y26_v1, Center for Open Science.
    13. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    14. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65_v1, Center for Open Science.
    15. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    16. Bull, Charles & Courty, Pascal & Doyon, Maurice & Rondeau, Daniel, 2019. "Failure of the Becker–DeGroot–Marschak mechanism in inexperienced subjects: New tests of the game form misconception hypothesis," Journal of Economic Behavior & Organization, Elsevier, vol. 159(C), pages 235-253.
    17. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    18. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    19. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    20. Roggenkamp, Hauke C., 2024. "Revisiting ‘Growth and Inequality in Public Good Provision’—Reproducing and Generalizing Through Inconvenient Online Experimentation," OSF Preprints 6rn97, Center for Open Science.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:kuhmz_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.