
The design of replication studies

Author

Listed:
  • Larry V. Hedges
  • Jacob M. Schauer

Abstract

Empirical evaluations of replication have become increasingly common, but there has been no unified approach to doing so. Some evaluations conduct only a single replication study while others run several, usually across multiple laboratories. Designing such programs has largely contended with difficult issues about which experimental components are necessary for a set of studies to be considered replications. However, another important consideration is that replication studies be designed to support sufficiently sensitive analyses. For instance, if hypothesis tests are to be conducted about replication, studies should be designed to ensure these tests are well‐powered; if not, it can be difficult to determine conclusively if replication attempts succeeded or failed. This paper describes methods for designing ensembles of replication studies to ensure that they are both adequately sensitive and cost‐efficient. It describes two potential analyses of replication studies—hypothesis tests and variance component estimation—and approaches to obtaining optimal designs for them. Using these results, it assesses the statistical power, precision of point estimators and optimality of the design used by the Many Labs Project and finds that while it may have been sufficiently powered to detect some larger differences between studies, other designs would have been less costly and/or produced more precise estimates or higher‐powered hypothesis tests.
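
To make the abstract's power considerations concrete, the sketch below approximates the power of the standard Q test for between-study heterogeneity under different ways of allocating a fixed number of observations across replication sites. It is a minimal illustration using the usual noncentral chi-square approximation, not the authors' exact design formulas; the effect size d, the heterogeneity tau2, and the (k, n) configurations are assumed values chosen only to show how power responds to the design.

    # Sketch: approximate power of the Q test for between-study heterogeneity
    # across k replication studies (noncentral chi-square approximation).
    # Assumes standardized-mean-difference studies with n subjects per arm;
    # all numerical inputs below are illustrative, not taken from the paper.
    from scipy.stats import chi2, ncx2

    def q_test_power(k, n, d=0.5, tau2=0.01, alpha=0.05):
        """Approximate power to detect between-study variance tau2."""
        v = 2.0 / n + d**2 / (4.0 * n)   # within-study variance of an estimated d
        w = 1.0 / v                      # inverse-variance weight (same for all sites)
        lam = tau2 * (k * w - (k * w**2) / (k * w))  # noncentrality: tau2*(sum w - sum w^2 / sum w)
        crit = chi2.ppf(1 - alpha, df=k - 1)         # critical value under homogeneity
        return 1 - ncx2.cdf(crit, df=k - 1, nc=lam)

    # Same total number of observations, split into many small or few large sites:
    for k, n in [(36, 100), (12, 300), (6, 600)]:
        print(f"k={k:2d} sites, n={n:3d} per arm: power ~ {q_test_power(k, n):.2f}")

Holding the total sample fixed, trading the number of sites against the per-site sample size changes both the degrees of freedom and the noncentrality of the test, which is the kind of design comparison the paper formalizes for replication programs such as the Many Labs Project.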

Suggested Citation

  • Larry V. Hedges & Jacob M. Schauer, 2021. "The design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 868-886, July.
  • Handle: RePEc:bla:jorssa:v:184:y:2021:i:3:p:868-886
    DOI: 10.1111/rssa.12688

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12688
    Download Restriction: no


    References listed on IDEAS

    1. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    2. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 543-570, October.
    3. Steve Perrin, 2014. "Preclinical research: Make mouse studies work," Nature, Nature, vol. 507(7493), pages 423-425, March.
    4. repec:cup:judgdm:v:4:y:2009:i:5:p:326-334 is not listed on IDEAS
    5. Francis S. Collins & Lawrence A. Tabak, 2014. "Policy: NIH plans to enhance reproducibility," Nature, Nature, vol. 505(7485), pages 612-613, January.
    6. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    2. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 543-570, October.
    3. Aaron C Ericsson & J Wade Davis & William Spollen & Nathan Bivens & Scott Givan & Catherine E Hagan & Mark McIntosh & Craig L Franklin, 2015. "Effects of Vendor and Genetic Background on the Composition of the Fecal Microbiota of Inbred Mice," PLOS ONE, Public Library of Science, vol. 10(2), pages 1-19, February.
    4. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    5. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    6. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    7. Fanelli, Daniele, 2022. "The "Tau" of Science - How to Measure, Study, and Integrate Quantitative and Qualitative Knowledge," MetaArXiv 67sak, Center for Open Science.
    8. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    9. Eric Buren & David Azzara & Javier Rangel-Moreno & Maria de la Luz Garcia-Hernandez & Shawn P. Murphy & Ethan D. Cohen & Ethan Lewis & Xihong Lin & Hae-Ryung Park, 2024. "Single-cell RNA sequencing reveals placental response under environmental stress," Nature Communications, Nature, vol. 15(1), pages 1-18, December.
    10. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    11. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    12. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    13. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    14. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    15. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    16. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    17. Galanis, S. & Ioannou, C. & Kotronis, S., 2019. "Information Aggregation Under Ambiguity: Theory and Experimental Evidence," Working Papers 20/05, Department of Economics, City University London.
    18. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    19. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    20. Doucouliagos, Hristos & Paldam, Martin & Stanley, T.D., 2018. "Skating on thin evidence: Implications for public policy," European Journal of Political Economy, Elsevier, vol. 54(C), pages 16-25.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:184:y:2021:i:3:p:868-886. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.