
Replication, Communication, and the Population Dynamics of Scientific Discovery

Authors

  • Richard McElreath
  • Paul E Smaldino

Abstract

Many published research results are false (Ioannidis, 2005), and controversy continues over the roles of replication and publication policy in improving the reliability of research. Addressing these problems is frustrated by the lack of a formal framework that jointly represents hypothesis formation, replication, publication bias, and variation in research quality. We develop a mathematical model of scientific discovery that combines all of these elements. It provides both a dynamic model of research and a formal framework for reasoning about the normative structure of science. We show that replication may serve as a ratchet that gradually separates true hypotheses from false, but the same factors that make initial findings unreliable also make replications unreliable. The most important factors in improving the reliability of research are the rate of false positives and the base rate of true hypotheses, and we offer suggestions for addressing each. Our results also bring clarity to verbal debates about the communication of research. Surprisingly, publication bias is not always an obstacle but may instead have positive impacts: suppression of negative novel findings is often beneficial. We also find that communication of negative replications may aid true discovery even when attempts to replicate have diminished power. The model speaks constructively to ongoing debates about the design and conduct of science, focusing analysis and discussion on precise, internally consistent models and highlighting the importance of population dynamics.
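
The replication ratchet described in the abstract can be illustrated with a simple Bayesian updating calculation. The sketch below is not the authors' population-dynamic model; it is a minimal Python illustration, under hypothetical parameter values, of how the base rate of true hypotheses and the false-positive rate jointly determine how quickly repeated positive results separate true hypotheses from false ones.

```python
# Minimal illustrative sketch (not the authors' model): Bayesian updating of
# the probability that a hypothesis is true after an initial positive result
# and subsequent positive replications.

def posterior_true(base_rate, power, alpha, n_positive):
    """P(hypothesis is true | n_positive independent positive results).

    base_rate : prior probability that a tested hypothesis is true
    power     : probability a true effect yields a positive result (1 - beta)
    alpha     : probability a null effect yields a (false) positive result
    """
    prior_odds = base_rate / (1.0 - base_rate)
    # Each positive result multiplies the odds by the likelihood ratio power / alpha.
    posterior_odds = prior_odds * (power / alpha) ** n_positive
    return posterior_odds / (1.0 + posterior_odds)

if __name__ == "__main__":
    # Hypothetical parameter values, for illustration only.
    base_rate, power, alpha = 0.1, 0.8, 0.05
    for n in range(4):
        print(f"{n} positive result(s): P(true) = "
              f"{posterior_true(base_rate, power, alpha, n):.3f}")
```

With these assumed values the probability climbs from 0.10 with no evidence to roughly 0.64 after one positive result and 0.97 after two: the ratchet in miniature. Because each positive result contributes a likelihood ratio of power/alpha, a higher false-positive rate or a lower base rate weakens every step of the ratchet, replications included, which is the formal sense in which the same factors that undermine initial findings also undermine replication.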

Suggested Citation

  • Richard McElreath & Paul E Smaldino, 2015. "Replication, Communication, and the Population Dynamics of Scientific Discovery," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-16, August.
  • Handle: RePEc:plo:pone00:0136088
    DOI: 10.1371/journal.pone.0136088

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0136088
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0136088&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0136088?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    3. Mina Bissell, 2013. "Reproducibility: The risks of the replication drive," Nature, Nature, vol. 503(7476), pages 333-334, November.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    2. Leonid Tiokhin & Minhua Yan & Thomas J. H. Morgan, 2021. "Competition for priority harms the reliability of science, but reforms can help," Nature Human Behaviour, Nature, vol. 5(7), pages 857-867, July.
    3. Merton S. Krause, 2019. "Replication and preregistration," Quality & Quantity: International Journal of Methodology, Springer, vol. 53(5), pages 2647-2652, September.
    4. Riko Kelter, 2021. "Analysis of type I and II error rates of Bayesian and frequentist parametric and nonparametric two-sample hypothesis tests under preliminary assessment of normality," Computational Statistics, Springer, vol. 36(2), pages 1263-1288, June.
    5. Mauricio González-Forero & Timm Faulwasser & Laurent Lehmann, 2017. "A model for brain life history evolution," PLOS Computational Biology, Public Library of Science, vol. 13(3), pages 1-28, March.
    6. Kelter, Riko, 2022. "Power analysis and type I and type II error rates of Bayesian nonparametric two-sample tests for location-shifts based on the Bayes factor under Cauchy priors," Computational Statistics & Data Analysis, Elsevier, vol. 165(C).
    7. Morgan, Thomas J. H. & Smaldino, Paul E., 2024. "Author-Paid Publication Fees Corrupt Science and Should Be Abandoned," OSF Preprints 3ez9v, Center for Open Science.
    8. Van Dooren, Wouter & Noordegraaf, Mirko, 2020. "Staging Science: Authoritativeness and Fragility of Models and Measurement in the COVID-19 Crisis," SocArXiv nfm5j, Center for Open Science.
    9. Riko Kelter, 2022. "A New Bayesian Two-Sample t Test and Solution to the Behrens–Fisher Problem Based on Gaussian Mixture Modelling with Known Allocations," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 14(3), pages 380-412, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    3. Hajko, Vladimír, 2017. "The failure of Energy-Economy Nexus: A meta-analysis of 104 studies," Energy, Elsevier, vol. 125(C), pages 771-787.
    4. Maren Duvendack & Richard W. Palmer-Jones & W. Robert Reed, 2015. "Replications in Economics: A Progress Report," Econ Journal Watch, Econ Journal Watch, vol. 12(2), pages 164-191, May.
    5. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    6. Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
    7. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    8. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    9. Koessler, Ann-Kathrin & Page, Lionel & Dulleck, Uwe, 2015. "Promoting pro-social behavior with public statements of good intent," MPRA Paper 80072, University Library of Munich, Germany, revised 24 May 2017.
    10. Ádám Kun, 2018. "Publish and Who Should Perish: You or Science?," Publications, MDPI, vol. 6(2), pages 1-16, April.
    11. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    12. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    13. Isabelle Bartram & Jonathan M Jeschke, 2019. "Do cancer stem cells exist? A pilot study combining a systematic review with the hierarchy-of-hypotheses approach," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-12, December.
    14. Andrea Saltelli & Monica Fiore, 2020. "From sociology of quantification to ethics of quantification," Palgrave Communications, Palgrave Macmillan, vol. 7(1), pages 1-8, December.
    15. Didier Sornette & Spencer Wheatley & Peter Cauwels, 2019. "The Fair Reward Problem: The Illusion Of Success And How To Solve It," Advances in Complex Systems (ACS), World Scientific Publishing Co. Pte. Ltd., vol. 22(03), pages 1-52, May.
    16. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    17. Santiago Sanchez-Pages & Claudia Rodriguez-Ruiz & Enrique Turiegano, 2014. "Facial Masculinity: How the Choice of Measurement Method Enables to Detect Its Influence on Behaviour," PLOS ONE, Public Library of Science, vol. 9(11), pages 1-10, November.
    18. Koessler, Ann-Kathrin & Page, Lionel & Dulleck, Uwe, 2018. "Public Statements of Good Conduct Promote Pro-Social Behavior," EconStor Preprints 180669, ZBW - Leibniz Information Centre for Economics.
    19. Andrew D Higginson & Marcus R Munafò, 2016. "Current Incentives for Scientists Lead to Underpowered Studies with Erroneous Conclusions," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-14, November.
    20. Jinzhou Li & Marloes H. Maathuis, 2021. "GGM knockoff filter: False discovery rate control for Gaussian graphical models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 534-558, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0136088. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register. Doing so allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone. General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.