
Randomly auditing research labs could be an affordable way to improve research quality: A simulation study

Authors
  • Adrian G Barnett
  • Pauline Zardo
  • Nicholas Graves

Abstract

The “publish or perish” incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors, a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world in which labs that produce many papers are more likely to have “child” labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives; audits also improved the behaviour of the removed lab’s “child” and “parent” labs, which increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations avoided the competitive spiral, defined as convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers avoided it in 95%. Audits worked best when applied only to established labs with 50 or more papers rather than to labs with 25 or more. Adding a ±20% random error to the number of false positives, to simulate peer reviewer error, did not reduce the audits’ efficacy. The main benefit of the audits came from the increased effort in “child” and “parent” labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world, and many unanswered questions about whether and how audits would work can only be addressed by a trial of an audit.
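The abstract compresses an evolutionary model into a few sentences, so a toy version may help make the moving parts concrete. The Python sketch below is a minimal illustration of the dynamic as described, not the authors’ published model: the 1.94% audit rate, the 50-paper threshold, the ±20% observation error, and the effort boost for “child” and “parent” labs come from the abstract, while everything else (population size, the effort-to-output and effort-to-error mappings, the detection threshold, the mutation size) is an assumption chosen for illustration.

import random

# Settings taken from the abstract.
AUDIT_RATE = 0.0194   # fraction of papers audited
MIN_PAPERS = 50       # audits only apply to established labs
# Everything below is an assumed, illustrative choice.
N_LABS, STEPS = 100, 5000
KIN_BOOST = 0.05      # effort increase for "parent" and "child" labs
THRESHOLD = 0.4       # false positive proportion that triggers removal

class Lab:
    def __init__(self, effort, parent=None):
        self.effort, self.parent = effort, parent   # effort in [0, 1]
        self.children, self.papers, self.false_pos = [], 0, 0

def p_false(lab):
    # Assumed mapping: lower effort -> higher false positive probability.
    return min(0.8, 0.05 + 0.75 * (1.0 - lab.effort))

def publish(lab):
    # Assumed trade-off: lower effort -> more papers per step.
    if random.random() < 0.3 + 0.6 * (1.0 - lab.effort):
        lab.papers += 1
        lab.false_pos += random.random() < p_false(lab)
        return True
    return False

def spawn_child(labs):
    # Selection on quantity: the most productive lab seeds a "child" that
    # inherits its effort (with mutation) and replaces a random lab.
    parent = max(labs, key=lambda l: l.papers)
    effort = min(1.0, max(0.0, parent.effort + random.gauss(0, 0.05)))
    child = Lab(effort, parent)
    parent.children.append(child)
    labs[random.randrange(len(labs))] = child

def audit(labs, lab):
    # The audited false positive count is observed with +/-20% random
    # error, mimicking peer reviewer error.
    observed = lab.false_pos * random.uniform(0.8, 1.2)
    if lab.papers >= MIN_PAPERS and observed / lab.papers > THRESHOLD:
        # Kin of the removed lab increase their effort.
        for kin in [lab.parent] + lab.children:
            if kin is not None:
                kin.effort = min(1.0, kin.effort + KIN_BOOST)
        labs[labs.index(lab)] = Lab(random.uniform(0.5, 1.0))  # fresh lab

def simulate(auditing):
    labs = [Lab(random.uniform(0.5, 1.0)) for _ in range(N_LABS)]
    for t in range(STEPS):
        for lab in list(labs):
            # Each new paper has an AUDIT_RATE chance of being audited.
            if publish(lab) and auditing and random.random() < AUDIT_RATE:
                audit(labs, lab)
        if t % 10 == 0:
            spawn_child(labs)
    return sum(p_false(l) for l in labs) / len(labs)

if __name__ == "__main__":
    random.seed(1)
    print(f"mean false positive probability, no audits:   {simulate(False):.2f}")
    print(f"mean false positive probability, with audits: {simulate(True):.2f}")

Run as-is, the script prints the mean false positive probability across labs with and without auditing; the qualitative gap between the two runs, not the exact numbers, is the point. The cost figure can also be sanity-checked with back-of-envelope arithmetic: at an assumed 80,000 NIH-funded papers per year, auditing 1.94% means roughly 1,550 audits, so the quoted $15.9 million implies a per-audit cost on the order of $10,000; the paper itself gives the actual inputs behind the estimate.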

Suggested Citation

  • Adrian G Barnett & Pauline Zardo & Nicholas Graves, 2018. "Randomly auditing research labs could be an affordable way to improve research quality: A simulation study," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-17, April.
  • Handle: RePEc:plo:pone00:0195613
    DOI: 10.1371/journal.pone.0195613

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0195613
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0195613&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0195613?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Grilli, Leonardo, 2021. "The effects of citation-based research evaluation schemes on self-citation behavior," Journal of Informetrics, Elsevier, vol. 15(4).
    2. Groen-Xu, Moqi & Bös, Gregor & Teixeira, Pedro A. & Voigt, Thomas & Knapp, Bernhard, 2023. "Short-term incentives of research evaluations: Evidence from the UK Research Excellence Framework," Research Policy, Elsevier, vol. 52(6).
    3. Corsini, Alberto & Pezzoni, Michele, 2023. "Does grant funding foster research impact? Evidence from France," Journal of Informetrics, Elsevier, vol. 17(4).
    4. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    5. A Cecile J W Janssens & Michael Goodman & Kimberly R Powell & Marta Gwinn, 2017. "A critical evaluation of the algorithm behind the Relative Citation Ratio (RCR)," PLOS Biology, Public Library of Science, vol. 15(10), pages 1-5, October.
    6. Marcel Knöchelmann, 2019. "Open Science in the Humanities, or: Open Humanities?," Publications, MDPI, vol. 7(4), pages 1-17, November.
    7. Mohammed S. Alqahtani & Mohamed Abbas & Mohammed Abdul Muqeet & Hussain M. Almohiy, 2022. "Research Productivity in Terms of Output, Impact, and Collaboration for University Researchers in Saudi Arabia: SciVal Analytics and t-Tests Statistical Based Approach," Sustainability, MDPI, vol. 14(23), pages 1-21, December.
    8. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    9. Li, Heyang & Wu, Meijun & Wang, Yougui & Zeng, An, 2022. "Bibliographic coupling networks reveal the advantage of diversification in scientific projects," Journal of Informetrics, Elsevier, vol. 16(3).
    10. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    11. Edré Moreira & Wagner Meira & Marcos André Gonçalves & Alberto H. F. Laender, 2023. "The rise of hyperprolific authors in computer science: characterization and implications," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(5), pages 2945-2974, May.
    12. Lanu Kim & Jason H. Portenoy & Jevin D. West & Katherine W. Stovel, 2020. "Scientific journals still matter in the era of academic search engines and preprint archives," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(10), pages 1218-1226, October.
    13. Corrêa Jr., Edilson A. & Silva, Filipi N. & da F. Costa, Luciano & Amancio, Diego R., 2017. "Patterns of authors contribution in scientific manuscripts," Journal of Informetrics, Elsevier, vol. 11(2), pages 498-510.
    14. Teresa Elizabeth Stone & Jane Conway, 2017. "Editorial: Self‐plagiarism prevention and management at Nursing & Health Sciences," Nursing & Health Sciences, John Wiley & Sons, vol. 19(1), pages 1-4, March.
    15. Torres-Salinas, Daniel & Valderrama-Baca, Pilar & Arroyo-Machado, Wenceslao, 2022. "Is there a need for a new journal metric? Correlations between JCR Impact Factor metrics and the Journal Citation Indicator—JCI," Journal of Informetrics, Elsevier, vol. 16(3).
    16. Joseph Gerald Hirschberg & Jeanette Ngaire Lye, 2020. "Grading Journals in Economics: The ABCs of the ABDC," Journal of Economic Surveys, Wiley Blackwell, vol. 34(4), pages 876-921, September.
    17. Timur Gareev & Irina Peker, 2023. "Quantity versus quality in publication activity: knowledge production at the regional level," Papers 2311.08830, arXiv.org.
    18. Jaime A. Teixeira da Silva & Quan-Hoang Vuong, 2021. "The right to refuse unwanted citations: rethinking the culture of science around the citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5355-5360, June.
    19. Joseph Staudt & Huifeng Yu & Robert P Light & Gerald Marschke & Katy Börner & Bruce A Weinberg, 2018. "High-impact and transformative science (HITS) metrics: Definition, exemplification, and comparison," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-23, July.
    20. Heng Huang & Donghua Zhu & Xuefeng Wang, 2022. "Evaluating scientific impact of publications: combining citation polarity and purpose," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5257-5281, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0195613. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.