
Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals

Author

Listed:
  • Jelte M Wicherts

Abstract

Background: Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure the quality of peer review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer review, and develop and validate a tool enabling different stakeholders to assess the transparency of the peer-review process.

Methods and Findings: Based on editorial guidelines and best practices, I developed a 14-item tool to rate the transparency of the peer-review process on the basis of journals’ websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated the transparency of the journals that published their work. Authors’ ratings of transparency were positively associated with the quality of the peer-review process but unrelated to the journals’ impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar.

Conclusions: The tool to assess the transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality, allowing the tool to be used to predict academic quality in new journals.
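The α = .91 reported in Study 2 is Cronbach's alpha across the 14 transparency items. As an illustrative sketch only (this is not the paper's code; the function name and the toy data are hypothetical), alpha can be computed from per-item score columns like so:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a rating instrument.

    item_scores: a list of k lists, one per item, each holding that
    item's score for every rated journal (all the same length).
    """
    k = len(item_scores)
    # Sum of the individual item variances.
    item_var = sum(pvariance(col) for col in item_scores)
    # Variance of each journal's total score across all items.
    totals = [sum(scores) for scores in zip(*item_scores)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two perfectly consistent items yield the maximum alpha of 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

Values near 1 indicate that the items measure a single underlying construct (here, transparency); the .91 reported above is in that range.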

Suggested Citation

  • Jelte M Wicherts, 2016. "Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals," PLOS ONE, Public Library of Science, vol. 11(1), pages 1-19, January.
  • Handle: RePEc:plo:pone00:0147913
    DOI: 10.1371/journal.pone.0147913

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0147913
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0147913&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0147913?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Takanori Ida & Naomi Fukuzawa, 2013. "Effects of large-scale research funding programs: a Japanese case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 1253-1273, March.
    2. Giovanni Abramo & Ciriaco Andrea D’Angelo & Marco Solazzi, 2010. "National research assessment exercises: a measure of the distortion of performance rankings when labor input is treated as uniform," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 605-619, September.
    3. van Raan, A. F. J. & van Leeuwen, Th. N., 2002. "Assessment of the scientific basis of interdisciplinary, applied research: Application of bibliometric methods in Nutrition and Food Research," Research Policy, Elsevier, vol. 31(4), pages 611-632, May.
    4. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Caprasecca, Alessandro, 2009. "Allocative efficiency in public research funding: Can bibliometrics help?," Research Policy, Elsevier, vol. 38(1), pages 206-215, February.
    5. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2011. "Evaluating research: from informed peer review to bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 499-514, June.
    6. Groot, Tom & Garcia-Valderrama, Teresa, 2006. "Research quality and efficiency: An analysis of assessments and management issues in Dutch economics and business research programs," Research Policy, Elsevier, vol. 35(9), pages 1362-1376, November.
    7. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    8. Young-Don Cho & Hoo-Gon Choi, 2013. "Principal parameters affecting R&D exploitation of nanotechnology research: a case for Korea," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 881-899, September.
    9. Bruno S. Frey & Katja Rost, 2010. "Do rankings reflect research quality?," Journal of Applied Economics, Universidad del CEMA, vol. 13, pages 1-38, May.
    10. Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
    11. Ludo Waltman & Nees Jan van Eck & Thed N. van Leeuwen & Martijn S. Visser & Anthony F. J. van Raan, 2011. "On the correlation between bibliometric indicators and peer review: reply to Opthof and Leydesdorff," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 1017-1022, September.
    12. Ulrich Schmoch & Torben Schubert, 2008. "Are international co-publications an indicator for quality of scientific research?," Scientometrics, Springer;Akadémiai Kiadó, vol. 74(3), pages 361-377, March.
    13. Chen, Kaihua & Guan, Jiancheng, 2011. "A bibliometric investigation of research performance in emerging nanobiopharmaceuticals," Journal of Informetrics, Elsevier, vol. 5(2), pages 233-247.
    14. Sigifredo Laengle & José M. Merigó & Nikunja Mohan Modak & Jian-Bo Yang, 2020. "Bibliometrics in operations research and management science: a university analysis," Annals of Operations Research, Springer, vol. 294(1), pages 769-813, November.
    15. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    16. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    17. Peng Wang & Fang-Wei Zhu & Hao-Yang Song & Jian-Hua Hou & Jin-Lan Zhang, 2018. "Visualizing the Academic Discipline of Knowledge Management," Sustainability, MDPI, vol. 10(3), pages 1-28, March.
    18. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    19. Kayvan Kousha & Mike Thelwall, 2024. "Factors associating with or predicting more cited or higher quality journal articles: An Annual Review of Information Science and Technology (ARIST) paper," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 75(3), pages 215-244, March.
    20. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0147913. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.