Printed from https://ideas.repec.org/a/sae/jedbes/v44y2019i5p543-570.html

More Than One Replication Study Is Needed for Unambiguous Tests of Replication

Author

Listed:
  • Larry V. Hedges

    (Northwestern University)

  • Jacob M. Schauer

    (Institute for Policy Research, Northwestern University)

Abstract

The problem of assessing whether experimental results can be replicated is becoming increasingly important in many areas of science. It is often assumed that assessing replication is straightforward: All one needs to do is repeat the study and see whether the results of the original and replication studies agree. This article shows that the power of the statistical test for whether two studies obtain the same effect is smaller than the power of either study to detect an effect in the first place. Thus, unless the original study and the replication study have unusually high power (e.g., power of 98%), a single replication study will not have adequate sensitivity to provide an unambiguous evaluation of replication.
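The abstract's claim can be illustrated with a quick power calculation. The sketch below uses illustrative numbers, not figures from the article: two studies each estimate a standardized effect with the same standard error, and the test of "same effect" compares their difference, whose standard error is inflated by a factor of sqrt(2). Even a between-study difference as large as the effect itself is then detected with far lower power than the effect in either single study.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_sided(noncentrality, alpha=0.05):
    """Power of a two-sided z-test at level alpha for a given
    noncentrality parameter (true mean of the z-statistic)."""
    z_crit = 1.959963984540054  # Phi^{-1}(1 - 0.05/2)
    return phi(noncentrality - z_crit) + phi(-noncentrality - z_crit)

# Illustrative (hypothetical) numbers: each study estimates an effect
# delta = 0.5 with standard error se = 0.18, so each study alone has
# roughly 80% power to detect the effect.
delta, se = 0.5, 0.18
print(power_two_sided(delta / se))              # ~0.79 for each study

# Test that the two studies obtained the SAME effect: the difference
# d1 - d2 has standard error se * sqrt(2), so a true difference as
# large as the effect itself is detected only about half the time.
print(power_two_sided(delta / (se * sqrt(2))))  # ~0.50
```

The sqrt(2) inflation of the standard error of the difference is what drives the article's conclusion: the replication test is systematically less sensitive than either study's own test of the effect.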

Suggested Citation

  • Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, , vol. 44(5), pages 543-570, October.
  • Handle: RePEc:sae:jedbes:v:44:y:2019:i:5:p:543-570
    DOI: 10.3102/1076998619852953

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.3102/1076998619852953
    Download Restriction: no

    File URL: https://libkey.io/10.3102/1076998619852953?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    2. Steve Perrin, 2014. "Preclinical research: Make mouse studies work," Nature, Nature, vol. 507(7493), pages 423-425, March.
    3. Gelman, Andrew & Stern, Hal, 2006. "The Difference Between 'Significant' and 'Not Significant' Is Not Itself Statistically Significant," The American Statistician, American Statistical Association, vol. 60, pages 328-331, November.
    4. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    5. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    6. Francis S. Collins & Lawrence A. Tabak, 2014. "Policy: NIH plans to enhance reproducibility," Nature, Nature, vol. 505(7485), pages 612-613, January.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sommet, Nicolas & Weissman, David Laurence & Cheutin, Nicolas & Elliot, Andrew, 2022. "How many participants do I need to test an interaction? Conducting an appropriate power analysis and achieving sufficient power to detect an interaction," OSF Preprints xhe3u, Center for Open Science.
    2. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65, Center for Open Science.
    3. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    4. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    5. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    6. Leon C Reteig & Lionel A Newman & K Richard Ridderinkhof & Heleen A Slagter, 2022. "Effects of tDCS on the attentional blink revisited: A statistical evaluation of a replication attempt," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-23, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
    2. Aaron C Ericsson & J Wade Davis & William Spollen & Nathan Bivens & Scott Givan & Catherine E Hagan & Mark McIntosh & Craig L Franklin, 2015. "Effects of Vendor and Genetic Background on the Composition of the Fecal Microbiota of Inbred Mice," PLOS ONE, Public Library of Science, vol. 10(2), pages 1-19, February.
    3. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    4. Minh-Hoang Nguyen & Tam-Tri Le & Hong-Kong To Nguyen & Manh-Toan Ho & Huyen T. Thanh Nguyen & Quan-Hoang Vuong, 2021. "Alice in Suicideland: Exploring the Suicidal Ideation Mechanism through the Sense of Connectedness and Help-Seeking Behaviors," IJERPH, MDPI, vol. 18(7), pages 1-24, April.
    5. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    6. Dennis Bontempi & Leonard Nuernberg & Suraj Pai & Deepa Krishnaswamy & Vamsi Thiriveedhi & Ahmed Hosny & Raymond H. Mak & Keyvan Farahani & Ron Kikinis & Andrey Fedorov & Hugo J. W. L. Aerts, 2024. "End-to-end reproducible AI pipelines in radiology using the cloud," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    7. Lukas Haffert, 2019. "War mobilization or war destruction? The unequal rise of progressive taxation revisited," The Review of International Organizations, Springer, vol. 14(1), pages 59-82, March.
    8. Michael A. Allen & Michael E. Flynn & Julie VanDusky-Allen, 2017. "Regions of Hierarchy and Security: US Troop Deployments, Spatial Relations, and Defense Burdens," International Interactions, Taylor & Francis Journals, vol. 43(3), pages 397-423, May.
    9. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    10. Antonella Lanati & Marinella Marzano & Caterina Manzari & Bruno Fosso & Graziano Pesole & Francesca De Leo, 2019. "Management at the service of research: ReOmicS, a quality management system for omics sciences," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-13, December.
    11. Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    12. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    13. Andrew P. Jaciw & Li Lin & Boya Ma, 2016. "An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials," Evaluation Review, , vol. 40(5), pages 410-443, October.
    14. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    15. Bor Luen Tang, 2023. "Some Insights into the Factors Influencing Continuous Citation of Retracted Scientific Papers," Publications, MDPI, vol. 11(4), pages 1-14, October.
    16. Rosenblatt, Lucas & Herman, Bernease & Holovenko, Anastasia & Lee, Wonkwon & Loftus, Joshua & McKinnie, Elizabeth & Rumezhak, Taras & Stadnik, Andrii & Howe, Bill & Stoyanovich, Julia, 2023. "Epistemic parity: reproducibility as an evaluation metric for differential privacy," LSE Research Online Documents on Economics 120493, London School of Economics and Political Science, LSE Library.
    17. Inga Patarčić & Jadranka Stojanovski, 2022. "Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals," Publications, MDPI, vol. 10(4), pages 1-10, November.
    18. Susanne Wieschowski & Svenja Biernot & Susanne Deutsch & Silke Glage & André Bleich & René Tolba & Daniel Strech, 2019. "Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres," PLOS ONE, Public Library of Science, vol. 14(11), pages 1-8, November.
    19. Ignacio, Escañuela Romana, 2019. "The elasticities of passenger transport demand in the Northeast Corridor," Research in Transportation Economics, Elsevier, vol. 78(C).
    20. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:jedbes:v:44:y:2019:i:5:p:543-570. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below). General contact details of provider: .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.