Printed from https://ideas.repec.org/a/plo/pone00/0183591.html

A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review

Authors
  • SeungHye Han
  • Tolani F Olonisakin
  • John P Pribis
  • Jill Zupetic
  • Joo Heung Yoon
  • Kyle M Holleran
  • Kwonho Jeong
  • Nader Shaikh
  • Doris M Rubio
  • Janet S Lee

Abstract

Irreproducibility of preclinical biomedical research has gained recent attention. It has been suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting, and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles published in 2013 (pre-checklist) and 2015 (post-checklist) were included for detailed evaluation of reported methodological and analytical information. We compared the quality of reporting of preclinical animal studies between the two journals, accounting for differences between journals and for changes in reporting over time. We find that reporting of randomization, blinding, and sample-size estimation improved significantly more in Nature than in Cell from 2013 to 2015, likely due to implementation of a checklist. Specifically, improvement in reporting of these three methodological items was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting of the sex of animals and of the number of independent experiments performed also improved from 2013 to 2015, likely owing to factors unrelated to a checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies.

Suggested Citation

  • SeungHye Han & Tolani F Olonisakin & John P Pribis & Jill Zupetic & Joo Heung Yoon & Kyle M Holleran & Kwonho Jeong & Nader Shaikh & Doris M Rubio & Janet S Lee, 2017. "A checklist is associated with increased quality of reporting preclinical biomedical research: A systematic review," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-14, September.
  • Handle: RePEc:plo:pone00:0183591
    DOI: 10.1371/journal.pone.0183591

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0183591
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0183591&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0183591?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    References listed on IDEAS

    1. Aaron Mobley & Suzanne K Linder & Russell Braeuer & Lee M Ellis & Leonard Zwelling, 2013. "A Survey on Data Reproducibility in Cancer Research Provides Insights into Our Limited Ability to Translate Findings from the Laboratory to the Clinic," PLOS ONE, Public Library of Science, vol. 8(5), pages 1-4, May.
    2. David Baker & Katie Lidster & Ana Sottomayor & Sandra Amor, 2014. "Two Years Later: Journals Are Not Yet Enforcing the ARRIVE Guidelines on Reporting Standards for Pre-Clinical Animal Studies," PLOS Biology, Public Library of Science, vol. 12(1), pages 1-6, January.
    3. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Marc T Avey & David Moher & Katrina J Sullivan & Dean Fergusson & Gilly Griffin & Jeremy M Grimshaw & Brian Hutton & Manoj M Lalu & Malcolm Macleod & John Marshall & Shirley H J Mei & Michael Rudnicki, 2016. "The Devil Is in the Details: Incomplete Reporting in Preclinical Animal Research," PLOS ONE, Public Library of Science, vol. 11(11), pages 1-13, November.
    2. Dean A Fergusson & Marc T Avey & Carly C Barron & Mathew Bocock & Kristen E Biefer & Sylvain Boet & Stephane L Bourque & Isidora Conic & Kai Chen & Yuan Yi Dong & Grace M Fox & Ronald B George & Neil , 2019. "Reporting preclinical anesthesia study (REPEAT): Evaluating the quality of reporting in the preclinical anesthesiology literature," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-15, May.
    3. Dennis Bontempi & Leonard Nuernberg & Suraj Pai & Deepa Krishnaswamy & Vamsi Thiriveedhi & Ahmed Hosny & Raymond H. Mak & Keyvan Farahani & Ron Kikinis & Andrey Fedorov & Hugo J. W. L. Aerts, 2024. "End-to-end reproducible AI pipelines in radiology using the cloud," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    4. Chai, Daniel & Ali, Searat & Brosnan, Mark & Hasso, Tim, 2024. "Understanding researchers' perceptions and experiences in finance research replication studies: A pre-registered report," Pacific-Basin Finance Journal, Elsevier, vol. 86(C).
    5. Natasha A Karp & Terry F Meehan & Hugh Morgan & Jeremy C Mason & Andrew Blake & Natalja Kurbatova & Damian Smedley & Julius Jacobsen & Richard F Mott & Vivek Iyer & Peter Matthews & David G Melvin & S, 2015. "Applying the ARRIVE Guidelines to an In Vivo Database," PLOS Biology, Public Library of Science, vol. 13(5), pages 1-11, May.
    6. Morgan Taschuk & Greg Wilson, 2017. "Ten simple rules for making research software more robust," PLOS Computational Biology, Public Library of Science, vol. 13(4), pages 1-10, April.
    7. Peter Harremoës, 2019. "Replication Papers," Publications, MDPI, vol. 7(3), pages 1-8, July.
    8. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesi," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    9. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    10. Antonella Lanati & Marinella Marzano & Caterina Manzari & Bruno Fosso & Graziano Pesole & Francesca De Leo, 2019. "Management at the service of research: ReOmicS, a quality management system for omics sciences," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-13, December.
    11. Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    12. Thomas F. Heston, 2024. "Redefining Significance: Robustness and Percent Fragility Indices in Biomedical Research," Stats, MDPI, vol. 7(2), pages 1-12, June.
    13. Erastus Karanja & Aditya Sharma & Ibrahim Salama, 2020. "What does MIS survey research reveal about diversity and representativeness in the MIS field? A content analysis approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1583-1628, March.
    14. Bor Luen Tang, 2023. "Some Insights into the Factors Influencing Continuous Citation of Retracted Scientific Papers," Publications, MDPI, vol. 11(4), pages 1-14, October.
    15. Sarah L R Stevens & Mateusz Kuzak & Carlos Martinez & Aurelia Moser & Petra Bleeker & Marc Galland, 2018. "Building a local community of practice in scientific programming for life scientists," PLOS Biology, Public Library of Science, vol. 16(11), pages 1-10, November.
    16. Rosenblatt, Lucas & Herman, Bernease & Holovenko, Anastasia & Lee, Wonkwon & Loftus, Joshua & McKinnie, Elizabeth & Rumezhak, Taras & Stadnik, Andrii & Howe, Bill & Stoyanovich, Julia, 2023. "Epistemic parity: reproducibility as an evaluation metric for differential privacy," LSE Research Online Documents on Economics 120493, London School of Economics and Political Science, LSE Library.
    17. Inga Patarčić & Jadranka Stojanovski, 2022. "Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals," Publications, MDPI, vol. 10(4), pages 1-10, November.
    18. Susanne Wieschowski & Svenja Biernot & Susanne Deutsch & Silke Glage & André Bleich & René Tolba & Daniel Strech, 2019. "Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres," PLOS ONE, Public Library of Science, vol. 14(11), pages 1-8, November.
    19. Shinichi Nakagawa & Edward R. Ivimey-Cook & Matthew J. Grainger & Rose E. O’Dea & Samantha Burke & Szymon M. Drobniak & Elliot Gould & Erin L. Macartney & April Robin Martinig & Kyle Morrison & Matthi, 2023. "Method Reporting with Initials for Transparency (MeRIT) promotes more granularity and accountability for author contributions," Nature Communications, Nature, vol. 14(1), pages 1-5, December.
    20. Paul J. Ferraro & J. Dustin Tracy, 2022. "A reassessment of the potential for loss-framed incentive contracts to increase productivity: a meta-analysis and a real-effort experiment," Experimental Economics, Springer;Economic Science Association, vol. 25(5), pages 1441-1466, November.


    Corrections

All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0183591. See the general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.