IDEAS home Printed from https://ideas.repec.org/a/plo/pbio00/2001307.html

Increasing efficiency of preclinical research by group sequential designs

Author

Listed:
  • Konrad Neumann
  • Ulrike Grittner
  • Sophie K Piper
  • Andre Rex
  • Oscar Florez-Vargas
  • George Karystianis
  • Alice Schneider
  • Ian Wellwood
  • Bob Siegerink
  • John P A Ioannidis
  • Jonathan Kimmelman
  • Ulrich Dirnagl

Abstract

Despite the potential benefits of sequential designs, studies evaluating treatments or experimental manipulations in preclinical experimental biomedicine almost exclusively use classical block designs. Our aim with this article is to bring the existing methodology of group sequential designs to the attention of researchers in the preclinical field and to clearly illustrate its potential utility. Group sequential designs can offer higher efficiency than traditional methods and are increasingly used in clinical trials. Using simulation of data, we demonstrate that group sequential designs have the potential to improve the efficiency of experimental studies, even when sample sizes are very small, as is currently prevalent in preclinical experimental biomedicine. When simulating data with a large effect size of d = 1 and a sample size of n = 18 per group, sequential frequentist analysis consumes in the long run only around 80% of the planned number of experimental units. In larger trials (n = 36 per group), additional stopping rules for futility lead to the saving of resources of up to 30% compared to block designs. We argue that these savings should be invested to increase sample sizes and hence power, since the currently underpowered experiments in preclinical biomedicine are a major threat to the value and predictiveness in this research domain.
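The efficiency gain described in the abstract can be illustrated with a small simulation. The sketch below is not the authors' actual analysis; it assumes a simple two-stage design with one interim look at half the sample and a Pocock-style adjusted significance threshold (alpha ≈ 0.0294 per look for two looks), stopping early for efficacy only. With d = 1 and n = 18 per group, the long-run consumption of experimental units comes out well below the planned total, in the spirit of the ~80% figure reported.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sequential_trial(d=1.0, looks=(9, 18), alpha_per_look=0.0294):
    """One two-stage group sequential trial (illustrative, not the paper's
    exact design): interim analysis at 9/group, final at 18/group, with an
    assumed Pocock-style adjusted alpha at each look. Returns the number of
    experimental units consumed (both groups combined)."""
    n_max = looks[-1]
    treated = rng.normal(d, 1.0, n_max)   # treatment group, effect size d
    control = rng.normal(0.0, 1.0, n_max) # control group
    for n in looks:
        p = stats.ttest_ind(treated[:n], control[:n]).pvalue
        if p < alpha_per_look:
            return 2 * n  # early stop for efficacy: only 2n units used
    return 2 * n_max      # trial ran to the planned maximum

consumed = [sequential_trial() for _ in range(5000)]
planned = 2 * 18
print(f"average fraction of planned units used: {np.mean(consumed) / planned:.2f}")
```

Because the interim test at n = 9 per group already has substantial power against d = 1, a sizable share of simulated trials stops at the first look, which is what drives the average consumption below the planned 36 units.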

Suggested Citation

  • Konrad Neumann & Ulrike Grittner & Sophie K Piper & Andre Rex & Oscar Florez-Vargas & George Karystianis & Alice Schneider & Ian Wellwood & Bob Siegerink & John P A Ioannidis & Jonathan Kimmelman & Ulrich Dirnagl, 2017. "Increasing efficiency of preclinical research by group sequential designs," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-9, March.
  • Handle: RePEc:plo:pbio00:2001307
    DOI: 10.1371/journal.pbio.2001307

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2001307
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.2001307&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.2001307?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Carol Kilkenny & Nick Parsons & Ed Kadyszewski & Michael F W Festing & Innes C Cuthill & Derek Fry & Jane Hutton & Douglas G Altman, 2009. "Survey of the Quality of Experimental Design, Statistical Analysis and Reporting of Research Using Animals," PLOS ONE, Public Library of Science, vol. 4(11), pages 1-11, November.
    2. Daniel Cressey, 2015. "UK funders demand strong statistics for animal studies," Nature, Nature, vol. 520(7547), pages 271-272, April.
    3. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sophie K Piper & Ulrike Grittner & Andre Rex & Nico Riedel & Felix Fischer & Robert Nadon & Bob Siegerink & Ulrich Dirnagl, 2019. "Exact replication: Foundation of science or game of chance?," PLOS Biology, Public Library of Science, vol. 17(4), pages 1-9, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    2. Abel Brodeur, Nikolai M. Cook, Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    3. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    4. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    5. Rosa Lavelle-Hill & Gavin Smith & Anjali Mazumder & Todd Landman & James Goulding, 2021. "Machine learning methods for “wicked” problems: exploring the complex drivers of modern slavery," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-11, December.
    6. David Winkelmann & Marius Ötting & Christian Deutscher & Tomasz Makarewicz, 2024. "Are Betting Markets Inefficient? Evidence From Simulations and Real Data," Journal of Sports Economics, , vol. 25(1), pages 54-97, January.
    7. Rinne, Sonja, 2024. "Estimating the merit-order effect using coarsened exact matching: Reconciling theory with the empirical results to improve policy implications," Energy Policy, Elsevier, vol. 185(C).
    8. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    9. Stephan B Bruns & John P A Ioannidis, 2016. "p-Curve and p-Hacking in Observational Research," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-13, February.
    10. Miguel Baiao & Ilze Buligina, 2021. "Work Experience Led Programs and Employment Attainment," International Journal of Economics & Business Administration (IJEBA), International Journal of Economics & Business Administration (IJEBA), vol. 0(1), pages 180-198.
    11. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    12. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    13. Julia Roloff & Michael J. Zyphur, 2019. "Null Findings, Replications and Preregistered Studies in Business Ethics Research," Journal of Business Ethics, Springer, vol. 160(3), pages 609-619, December.
    14. Ingmar Böschen, 2021. "Software review: The JATSdecoder package—extract metadata, abstract and sectioned text from NISO-JATS coded XML documents; Insights to PubMed central’s open access database," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9585-9601, December.
    15. Beverly S Muhlhausler & Frank H Bloomfield & Matthew W Gillman, 2013. "Whole Animal Experiments Should Be More Like Human Randomized Controlled Trials," PLOS Biology, Public Library of Science, vol. 11(2), pages 1-6, February.
    16. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    17. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    18. Marko Kovic & Nina Hänsli, 2017. "The Impact of Political Cleavages, Religiosity, and Values on Attitudes towards Nonprofit Organizations," Social Sciences, MDPI, vol. 7(1), pages 1-18, December.
    19. Martin E Héroux & Janet L Taylor & Simon C Gandevia, 2015. "The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-10, December.
    20. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:2001307. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.