Printed from https://ideas.repec.org/a/plo/pbio00/2000598.html

Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor

Author

Listed:
  • Lucile Vogt
  • Thomas S Reichlin
  • Christina Nathues
  • Hanno Würbel

Abstract

Accumulating evidence indicates a high risk of bias in preclinical animal research, calling into question the scientific validity and reproducibility of published research findings. Systematic reviews have found low rates of reporting of measures against risks of bias in the published literature (e.g., randomization, blinding, sample size calculation) and a correlation between low reporting rates and inflated treatment effects. Because most animal research undergoes peer review or ethical review, risks of bias could in principle be detected at an earlier stage, before the research has been conducted. For example, in Switzerland, animal experiments are licensed based on a detailed description of the study protocol and a harm–benefit analysis. We therefore screened applications for animal experiments submitted to Swiss authorities (n = 1,277) for the rates at which seven basic measures against bias (allocation concealment, blinding, randomization, sample size calculation, inclusion/exclusion criteria, primary outcome variable, and statistical analysis plan) were described, and compared these with the reporting rates of the same measures in a representative sub-sample of publications (n = 50) resulting from studies described in these applications. Measures against bias were described at very low rates, ranging on average from 2.4% for statistical analysis plan to 19% for primary outcome variable in applications for animal experiments, and from 0.0% for sample size calculation to 34% for statistical analysis plan in publications from these experiments. Calculating an internal validity score (IVS) based on the proportion of the seven measures against bias, we found a weak positive correlation between the IVS of applications and that of publications (Spearman’s rho = 0.34, p = 0.014), indicating that the rates at which these measures are described in applications partly predict the rates at which they are reported in publications. These results indicate that the authorities licensing animal experiments lack important information about experimental conduct that determines the scientific validity of the findings and may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Much as manuscripts are accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings cast serious doubt on the current authorization procedure for animal experiments, as well as on the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing the authorization procedures already in place in many countries into a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals from inconclusive research.

Author Summary

Scientific validity of research findings depends on scientific rigor, including measures to avoid bias such as random allocation of animals to treatment groups (randomization) and assessing outcome measures without knowing to which treatment groups the animals belong (blinding). However, measures against bias are rarely reported in publications, and systematic reviews have found that poor reporting was associated with larger treatment effects, suggesting bias. Here we studied whether risk of bias could be predicted from study protocols submitted for ethical review. We assessed the mention of seven basic measures against bias in study protocols submitted for approval in Switzerland and in publications resulting from these studies. Measures against bias were mentioned at very low rates both in study protocols (2%–19%) and in publications (0%–34%). However, we found a weak positive correlation, indicating that the rates at which measures against bias were mentioned in study protocols predicted the rates at which they were reported in publications. Our results indicate that animal experiments are often licensed based on confidence rather than evidence of scientific rigor, which may compromise scientific validity and cause unnecessary harm to animals through inconclusive research.
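As an illustration of the internal validity score (IVS) described above, the following Python sketch scores a document as the proportion of the seven measures against bias it describes and correlates paired application and publication scores with Spearman's rho. The function name, the data structure, and the example pairs are hypothetical and serve only to illustrate the idea; the authors' actual scoring and analysis may differ.

    # Minimal sketch, assuming the IVS is the proportion of the seven measures
    # against bias that a document (application or publication) describes.
    # The paired example data below are invented for illustration only.
    from scipy.stats import spearmanr

    MEASURES = [
        "allocation_concealment", "blinding", "randomization",
        "sample_size_calculation", "inclusion_exclusion_criteria",
        "primary_outcome_variable", "statistical_analysis_plan",
    ]

    def internal_validity_score(described: dict) -> float:
        """Proportion of the seven measures against bias that are described."""
        return sum(bool(described.get(m, False)) for m in MEASURES) / len(MEASURES)

    # Invented example: each pair holds the measures described in an application
    # and in the matching publication (True = described).
    pairs = [
        ({"randomization": True, "blinding": True}, {"statistical_analysis_plan": True}),
        ({"primary_outcome_variable": True}, {}),
        ({}, {"randomization": True, "blinding": True}),
    ]

    ivs_applications = [internal_validity_score(app) for app, _ in pairs]
    ivs_publications = [internal_validity_score(pub) for _, pub in pairs]

    # Rank correlation between application and publication scores.
    rho, p_value = spearmanr(ivs_applications, ivs_publications)
    print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")

With the real data set of matched applications and publications in place of the invented pairs, this kind of calculation would yield the reported correlation (rho = 0.34, p = 0.014).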

Suggested Citation

  • Lucile Vogt & Thomas S Reichlin & Christina Nathues & Hanno Würbel, 2016. "Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor," PLOS Biology, Public Library of Science, vol. 14(12), pages 1-24, December.
  • Handle: RePEc:plo:pbio00:2000598
    DOI: 10.1371/journal.pbio.2000598

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2000598
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.2000598&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.2000598?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    References listed on IDEAS

    1. H Bart van der Worp & David W Howells & Emily S Sena & Michelle J Porritt & Sarah Rewell & Victoria O'Collins & Malcolm R Macleod, 2010. "Can Animal Models of Disease Reliably Inform Human Studies?," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-8, March.
    2. Judith van Luijk & Brenda Bakker & Maroeska M Rovers & Merel Ritskes-Hoitinga & Rob B M de Vries & Marlies Leenaars, 2014. "Systematic Reviews of Animal Studies; Missing Link in Translational Research?," PLOS ONE, Public Library of Science, vol. 9(3), pages 1-5, March.
    3. Emily S Sena & H Bart van der Worp & Philip M W Bath & David W Howells & Malcolm R Macleod, 2010. "Publication Bias in Reports of Animal Stroke Studies Leads to Major Overstatement of Efficacy," PLOS Biology, Public Library of Science, vol. 8(3), pages 1-8, March.
    4. Malcolm R Macleod & Aaron Lawson McLean & Aikaterini Kyriakopoulou & Stylianos Serghiou & Arno de Wilde & Nicki Sherratt & Theo Hirst & Rachel Hemblade & Zsanett Bahor & Cristina Nunes-Fonseca & Aparn, 2015. "Risk of Bias in Reports of In Vivo Research: A Focus for Improvement," PLOS Biology, Public Library of Science, vol. 13(10), pages 1-12, October.
    5. Carol Kilkenny & William J Browne & Innes C Cuthill & Michael Emerson & Douglas G Altman, 2010. "Improving Bioscience Research Reporting: The ARRIVE Guidelines for Reporting Animal Research," PLOS Biology, Public Library of Science, vol. 8(6), pages 1-5, June.
    6. David Krauth & Andrew Anglemyer & Rose Philipps & Lisa Bero, 2014. "Nonindustry-Sponsored Preclinical Studies on Statins Yield Greater Efficacy Estimates Than Industry-Sponsored Studies: A Meta-Analysis," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-10, January.
    7. Valerie C Henderson & Jonathan Kimmelman & Dean Fergusson & Jeremy M Grimshaw & Dan G Hackam, 2013. "Threats to Validity in the Design and Conduct of Preclinical Efficacy Studies: A Systematic Review of Guidelines for In Vivo Animal Experiments," PLOS Medicine, Public Library of Science, vol. 10(7), pages 1-14, July.
    8. Malcolm Macleod, 2014. "Some Salt with Your Statin, Professor?," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-3, January.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Pandora Pound & Christine J Nicol, 2018. "Retrospective harm benefit analysis of pre-clinical animal research for six treatment interventions," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-26, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Constance Holman & Sophie K Piper & Ulrike Grittner & Andreas Antonios Diamantaras & Jonathan Kimmelman & Bob Siegerink & Ulrich Dirnagl, 2016. "Where Have All the Rodents Gone? The Effects of Attrition in Experimental Research on Cancer and Stroke," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-12, January.
    2. Jonathan Kimmelman & Alex John London, 2011. "Predicting Harms and Benefits in Translational Trials: Ethics, Evidence, and Uncertainty," PLOS Medicine, Public Library of Science, vol. 8(3), pages 1-5, March.
    3. Malcolm R Macleod & Aaron Lawson McLean & Aikaterini Kyriakopoulou & Stylianos Serghiou & Arno de Wilde & Nicki Sherratt & Theo Hirst & Rachel Hemblade & Zsanett Bahor & Cristina Nunes-Fonseca & Aparn, 2015. "Risk of Bias in Reports of In Vivo Research: A Focus for Improvement," PLOS Biology, Public Library of Science, vol. 13(10), pages 1-12, October.
    4. Catriona J MacCallum, 2010. "Reporting Animal Studies: Good Science and a Duty of Care," PLOS Biology, Public Library of Science, vol. 8(6), pages 1-2, June.
    5. Valerie C Henderson & Jonathan Kimmelman & Dean Fergusson & Jeremy M Grimshaw & Dan G Hackam, 2013. "Threats to Validity in the Design and Conduct of Preclinical Efficacy Studies: A Systematic Review of Guidelines for In Vivo Animal Experiments," PLOS Medicine, Public Library of Science, vol. 10(7), pages 1-14, July.
    6. Konstantinos K Tsilidis & Orestis A Panagiotou & Emily S Sena & Eleni Aretouli & Evangelos Evangelou & David W Howells & Rustam Al-Shahi Salman & Malcolm R Macleod & John P A Ioannidis, 2013. "Evaluation of Excess Significance Bias in Animal Studies of Neurological Diseases," PLOS Biology, Public Library of Science, vol. 11(7), pages 1-10, July.
    7. Jonathan A Eisen & Emma Ganley & Catriona J MacCallum, 2014. "Open Science and Reporting Animal Studies: Who's Accountable?," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-3, January.
    8. Jonathan Kimmelman & Jeffrey S Mogil & Ulrich Dirnagl, 2014. "Distinguishing between Exploratory and Confirmatory Preclinical Research Will Improve Translation," PLOS Biology, Public Library of Science, vol. 12(5), pages 1-4, May.
    9. Jason A Miranda & Phil Stanley & Katrina Gore & Jamie Turner & Rebecca Dias & Huw Rees, 2014. "A Preclinical Physiological Assay to Test Modulation of Knee Joint Pain in the Spinal Cord: Effects of Oxycodone and Naproxen," PLOS ONE, Public Library of Science, vol. 9(8), pages 1-11, August.
    10. Pandora Pound & Christine J Nicol, 2018. "Retrospective harm benefit analysis of pre-clinical animal research for six treatment interventions," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-26, March.
    11. Kimberley E Wever & Carlijn R Hooijmans & Niels P Riksen & Thomas B Sterenborg & Emily S Sena & Merel Ritskes-Hoitinga & Michiel C Warlé, 2015. "Determinants of the Efficacy of Cardiac Ischemic Preconditioning: A Systematic Review and Meta-Analysis of Animal Studies," PLOS ONE, Public Library of Science, vol. 10(11), pages 1-17, November.
    12. Nathalie Percie du Sert & Viki Hurst & Amrita Ahluwalia & Sabina Alam & Marc T Avey & Monya Baker & William J Browne & Alejandra Clark & Innes C Cuthill & Ulrich Dirnagl & Michael Emerson & Paul Garne, 2020. "The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-12, July.
    13. David Baker & Katie Lidster & Ana Sottomayor & Sandra Amor, 2014. "Two Years Later: Journals Are Not Yet Enforcing the ARRIVE Guidelines on Reporting Standards for Pre-Clinical Animal Studies," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-6, January.
    14. Susanne Wieschowski & Diego S Silva & Daniel Strech, 2016. "Animal Study Registries: Results from a Stakeholder Analysis on Potential Strengths, Weaknesses, Facilitators, and Barriers," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-12, November.
    15. Jennifer A Hirst & Jeremy Howick & Jeffrey K Aronson & Nia Roberts & Rafael Perera & Constantinos Koshiaris & Carl Heneghan, 2014. "The Need for Randomization in Animal Trials: An Overview of Systematic Reviews," PLOS ONE, Public Library of Science, vol. 9(6), pages 1-11, June.
    16. Carol Kilkenny & William J Browne & Innes C Cuthill & Michael Emerson & Douglas G Altman, 2010. "Improving Bioscience Research Reporting: The ARRIVE Guidelines for Reporting Animal Research," PLOS Biology, Public Library of Science, vol. 8(6), pages 1-5, June.
    17. Gillian L Currie & Helena N Angel-Scott & Lesley Colvin & Fala Cramond & Kaitlyn Hair & Laila Khandoker & Jing Liao & Malcolm Macleod & Sarah K McCann & Rosie Morland & Nicki Sherratt & Robert Stewart, 2019. "Animal models of chemotherapy-induced peripheral neuropathy: A machine-assisted systematic review and meta-analysis," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-34, May.
    18. Ana Antonic & Emily S Sena & Jennifer S Lees & Taryn E Wills & Peta Skeers & Peter E Batchelor & Malcolm R Macleod & David W Howells, 2013. "Stem Cell Transplantation in Traumatic Spinal Cord Injury: A Systematic Review and Meta-Analysis of Animal Studies," PLOS Biology, Public Library of Science, vol. 11(12), pages 1-14, December.
    19. Malcolm Macleod, 2014. "Some Salt with Your Statin, Professor?," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-3, January.
    20. Julie E. Goodman & Catherine Petito Boyce & Sonja N. Sax & Leslie A. Beyer & Robyn L. Prueitt, 2015. "Rethinking Meta‐Analysis: Applications for Air Pollution Data and Beyond," Risk Analysis, John Wiley & Sons, vol. 35(6), pages 1017-1039, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:2000598. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.