Printed from https://ideas.repec.org/a/plo/pbio00/2001307.html

Increasing efficiency of preclinical research by group sequential designs

Author

Listed:
  • Konrad Neumann
  • Ulrike Grittner
  • Sophie K Piper
  • Andre Rex
  • Oscar Florez-Vargas
  • George Karystianis
  • Alice Schneider
  • Ian Wellwood
  • Bob Siegerink
  • John P A Ioannidis
  • Jonathan Kimmelman
  • Ulrich Dirnagl

Abstract

Despite the potential benefits of sequential designs, studies evaluating treatments or experimental manipulations in preclinical experimental biomedicine almost exclusively use classical block designs. Our aim with this article is to bring the existing methodology of group sequential designs to the attention of researchers in the preclinical field and to clearly illustrate its potential utility. Group sequential designs can offer higher efficiency than traditional methods and are increasingly used in clinical trials. Using simulated data, we demonstrate that group sequential designs have the potential to improve the efficiency of experimental studies, even when sample sizes are very small, as is currently prevalent in preclinical experimental biomedicine. When simulating data with a large effect size of d = 1 and a sample size of n = 18 per group, sequential frequentist analysis consumes in the long run only around 80% of the planned number of experimental units. In larger trials (n = 36 per group), additional stopping rules for futility save up to 30% of resources compared to block designs. We argue that these savings should be invested in larger sample sizes and hence higher power, since currently underpowered experiments are a major threat to the value and predictiveness of preclinical biomedical research.
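The efficiency claim in the abstract can be illustrated with a minimal Monte Carlo sketch (this is not the authors' actual simulation code): two groups with a true effect size of d = 1 and n = 18 per group are analysed at three equally sized interim looks, using a Pocock-style constant nominal significance level of about 0.0221 per look to keep the overall two-sided type I error near 0.05. The three-stage batching, the seed, and the 2,000-replication count are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def sequential_trial(n_per_group=18, d=1.0, looks=3, alpha_per_look=0.0221):
    """One simulated two-arm experiment analysed at `looks` interim stages.

    alpha_per_look ~ 0.0221 is the Pocock-style constant nominal level
    that keeps the overall two-sided type I error near 0.05 for 3 looks.
    """
    stage = n_per_group // looks               # units per group recruited per stage
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(d, 1.0, n_per_group)  # true standardized effect size d
    for k in range(1, looks + 1):
        n = stage * k
        _, p = stats.ttest_ind(treated[:n], control[:n])
        if p < alpha_per_look:
            return 2 * n, True                 # stop early: units consumed, H0 rejected
    return 2 * n_per_group, False              # ran to the planned maximum

results = [sequential_trial() for _ in range(2000)]
used = np.mean([units for units, _ in results])
print(f"average fraction of planned units consumed: {used / 36:.2f}")
```

With a large true effect, many simulated experiments cross the efficacy boundary at an interim look, so the long-run average consumption falls below the planned 36 units per experiment, consistent with the savings the abstract reports.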

Suggested Citation

  • Konrad Neumann & Ulrike Grittner & Sophie K Piper & Andre Rex & Oscar Florez-Vargas & George Karystianis & Alice Schneider & Ian Wellwood & Bob Siegerink & John P A Ioannidis & Jonathan Kimmelman & Ulrich Dirnagl, 2017. "Increasing efficiency of preclinical research by group sequential designs," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-9, March.
  • Handle: RePEc:plo:pbio00:2001307
    DOI: 10.1371/journal.pbio.2001307

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2001307
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.2001307&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.2001307?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Carol Kilkenny & Nick Parsons & Ed Kadyszewski & Michael F W Festing & Innes C Cuthill & Derek Fry & Jane Hutton & Douglas G Altman, 2009. "Survey of the Quality of Experimental Design, Statistical Analysis and Reporting of Research Using Animals," PLOS ONE, Public Library of Science, vol. 4(11), pages 1-11, November.
    2. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    3. Daniel Cressey, 2015. "UK funders demand strong statistics for animal studies," Nature, Nature, vol. 520(7547), pages 271-272, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Sophie K Piper & Ulrike Grittner & Andre Rex & Nico Riedel & Felix Fischer & Robert Nadon & Bob Siegerink & Ulrich Dirnagl, 2019. "Exact replication: Foundation of science or game of chance?," PLOS Biology, Public Library of Science, vol. 17(4), pages 1-9, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    2. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    3. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65_v1, Center for Open Science.
    4. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    5. David Winkelmann & Marius Ötting & Christian Deutscher & Tomasz Makarewicz, 2024. "Are Betting Markets Inefficient? Evidence From Simulations and Real Data," Journal of Sports Economics, , vol. 25(1), pages 54-97, January.
    6. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    7. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesia," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    8. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    9. Beverly S Muhlhausler & Frank H Bloomfield & Matthew W Gillman, 2013. "Whole Animal Experiments Should Be More Like Human Randomized Controlled Trials," PLOS Biology, Public Library of Science, vol. 11(2), pages 1-6, February.
    10. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    11. Martin E Héroux & Janet L Taylor & Simon C Gandevia, 2015. "The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-10, December.
    12. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    13. Tracey L Weissgerber, 2021. "Learning from the past to develop data analysis curricula for the future," PLOS Biology, Public Library of Science, vol. 19(7), pages 1-3, July.
    14. van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2018. "P-uniform," MetaArXiv zqjr9, Center for Open Science.
    15. Nicky Agate & Rebecca Kennison & Stacy Konkiel & Christopher P. Long & Jason Rhody & Simone Sacchi & Penelope Weber, 2020. "The transformative power of values-enacted scholarship," Palgrave Communications, Palgrave Macmillan, vol. 7(1), pages 1-12, December.
    16. Feuz, Ryan, 2023. "Hedonic Price Analysis of Used Tractors," Applied Economics Teaching Resources (AETR), Agricultural and Applied Economics Association, vol. 5(01), January.
    17. Adriano Koshiyama & Nick Firoozye, 2019. "Avoiding Backtesting Overfitting by Covariance-Penalties: an empirical investigation of the ordinary and total least squares cases," Papers 1905.05023, arXiv.org.
    18. Augusteijn, Hilde & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2017. "The Effect of Publication Bias on the Assessment of Heterogeneity," OSF Preprints gv25c, Center for Open Science.
    19. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    20. Mattia Prosperi & Jiang Bian & Iain E. Buchan & James S. Koopman & Matthew Sperrin & Mo Wang, 2019. "Raiders of the lost HARK: a reproducible inference framework for big data science," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:2001307. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.