
Menage a Quoi? Optimal Number of Peer Reviewers

Author

Listed:
  • Richard R Snell

Abstract

Peer review represents the primary mechanism used by funding agencies to allocate financial support and by journals to select manuscripts for publication, yet recent Cochrane reviews determined that the literature on peer review best practice is sparse. Key to improving the process are reducing its inherent vulnerability to a high degree of randomness and, from an economic perspective, limiting both the substantial indirect costs of reviewer time invested and the direct administrative costs to funding agencies, publishers and research institutions. Using additional reviewers per application may increase reliability and decision consistency, but adds to overall cost and burden. The optimal number of reviewers per application is not known, but is thought to vary with the accuracy of the judges or evaluation methods. Here I use bootstrapping of replicated peer review data from a Post-doctoral Fellowships competition to show that five reviewers per application represents a practical optimum: it avoids the large random effects evident when fewer reviewers are used, and it marks the point beyond which additional reviewers, at increasing cost, provide only diminishing incremental gains in chance-corrected consistency of decision outcomes. Random effects were most evident in the relative mid-range of competitiveness. The results support aggressive high- and low-end stratification or triaging of applications for subsequent stages of review, with the proportion and set of mid-range submissions retained for further consideration depending on the overall success rate.
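
The abstract describes a bootstrap-resampling approach to estimating how chance-corrected decision consistency scales with panel size. The paper's own analysis uses real replicated review data from a Post-doctoral Fellowships competition; the sketch below is only an illustrative stand-in that generates synthetic reviewer scores, repeatedly draws two disjoint panels of k reviewers each, and compares the funding decisions they would produce using Cohen's kappa. The noise model, the split-panel design, the use of Cohen's kappa and all names and parameter values are assumptions for illustration, not the author's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

n_apps, n_reviewers, success_rate = 200, 10, 0.25

# Synthetic stand-in for replicated review data: each application has a
# latent quality score plus independent reviewer noise (assumed model).
quality = rng.normal(size=n_apps)
scores = quality[:, None] + rng.normal(scale=1.0, size=(n_apps, n_reviewers))

def fund(mean_scores, rate):
    """Fund the top `rate` fraction of applications by mean score."""
    cutoff = np.quantile(mean_scores, 1.0 - rate)
    return mean_scores >= cutoff

def cohen_kappa(a, b):
    """Chance-corrected agreement between two binary decision vectors."""
    p_obs = np.mean(a == b)
    p_exp = np.mean(a) * np.mean(b) + np.mean(~a) * np.mean(~b)
    return (p_obs - p_exp) / (1.0 - p_exp)

n_boot = 500
for k in range(1, n_reviewers // 2 + 1):
    kappas = []
    for _ in range(n_boot):
        # Draw two disjoint panels of k reviewers and compare the funding
        # decisions each panel would make on the same set of applications.
        cols = rng.permutation(n_reviewers)
        panel_a = scores[:, cols[:k]].mean(axis=1)
        panel_b = scores[:, cols[k:2 * k]].mean(axis=1)
        kappas.append(cohen_kappa(fund(panel_a, success_rate),
                                  fund(panel_b, success_rate)))
    print(f"k = {k} reviewers per panel: mean kappa = {np.mean(kappas):.3f}")
```

With this toy noise model the kappa curve rises steeply for small panels and then flattens as k grows, which is the qualitative pattern of diminishing returns described in the abstract; the specific optimum of five reviewers comes from the paper's real competition data, not from this sketch.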

Suggested Citation

  • Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
  • Handle: RePEc:plo:pone00:0120838
    DOI: 10.1371/journal.pone.0120838

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0120838
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0120838&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. Squazzoni, Flaminio & Bravo, Giangiacomo & Takács, Károly, 2013. "Does incentive provision increase the quality of peer review? An experimental study," Research Policy, Elsevier, vol. 42(1), pages 287-294.
    2. Hendy Abdoul & Christophe Perrey & Philippe Amiel & Florence Tubach & Serge Gottot & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-15, September.
    3. J Britt Holbrook & Robert Frodeman, 2011. "Peer review and the ex ante assessment of societal impacts," Research Evaluation, Oxford University Press, vol. 20(3), pages 239-246, September.
    4. Michael Obrecht & Karl Tibelius & Guy D'Aloisio, 2007. "Examining the value added by committee discussion in the review of applications for research awards," Research Evaluation, Oxford University Press, vol. 16(2), pages 79-91, June.
    5. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    6. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    7. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    8. Liv Langfeldt, 2006. "The policy challenges of peer review: managing bias, conflict of interests and interdisciplinary assessments," Research Evaluation, Oxford University Press, vol. 15(1), pages 31-41, April.
    9. Paul J Roebber & David M Schultz, 2011. "Peer Review, Program Officers and Science Funding," PLOS ONE, Public Library of Science, vol. 6(4), pages 1-6, April.
    10. Terttu Luukkonen, 2012. "Conservatism and risk-taking in peer review: Emerging ERC practices," Research Evaluation, Oxford University Press, vol. 21(1), pages 48-60, February.
    11. David Kaplan & Nicola Lacetera & Celia Kaplan, 2008. "Sample Size and Precision in NIH Peer Review," PLOS ONE, Public Library of Science, vol. 3(7), pages 1-3, July.
    12. Peres-Neto, Pedro R. & Jackson, Donald A. & Somers, Keith M., 2005. "How many principal components? stopping rules for determining the number of non-trivial axes revisited," Computational Statistics & Data Analysis, Elsevier, vol. 49(4), pages 974-997, June.
    13. Lipworth, Wendy L. & Kerridge, Ian H. & Carter, Stacy M. & Little, Miles, 2011. "Journal peer review in context: A qualitative study of the social and subjective dimensions of manuscript review in biomedical publishing," Social Science & Medicine, Elsevier, vol. 72(7), pages 1056-1063, April.
    14. Koehler, Elizabeth & Brown, Elizabeth & Haneuse, Sebastien J.-P. A., 2009. "On the Assessment of Monte Carlo Error in Simulation-Based Statistical Analyses," The American Statistician, American Statistical Association, vol. 63(2), pages 155-162.
    15. Lutz Bornmann & Hans-Dieter Daniel, 2006. "Potential sources of bias in research fellowship assessments: effects of university prestige and field of study," Research Evaluation, Oxford University Press, vol. 15(3), pages 209-219, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    2. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    3. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    4. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    5. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    6. Marco Seeber & Alberto Bacchelli, 2017. "Does single blind peer review hinder newcomers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 567-585, October.
    7. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    8. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    9. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    10. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    11. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    12. Gemma Elizabeth Derrick & Alessandra Zimmermann & Helen Greaves & Jonathan Best & Richard Klavans, 2024. "Targeted, actionable and fair: Reviewer reports as feedback and its effect on ECR career choices," Research Evaluation, Oxford University Press, vol. 32(4), pages 648-657.
    13. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    14. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), February.
    15. Sergio Copiello, 2018. "On the money value of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 613-620, April.
    16. Dennis L Murray & Douglas Morris & Claude Lavoie & Peter R Leavitt & Hugh MacIsaac & Michael E J Masson & Marc-Andre Villard, 2016. "Bias in Research Grant Evaluation Has Dire Consequences for Small Universities," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-19, June.
    17. Lutz Bornmann, 2012. "The Hawthorne effect in journal peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 857-862, June.
    18. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    19. García, J.A. & Montero-Parodi, J.J. & Rodriguez-Sánchez, Rosa & Fdez-Valdivia, J., 2023. "How to motivate a reviewer with a present bias to work harder," Journal of Informetrics, Elsevier, vol. 17(4).
    20. Oviedo-García, M. Ángeles, 2016. "Tourism research quality: Reviewing and assessing interdisciplinarity," Tourism Management, Elsevier, vol. 52(C), pages 586-592.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0120838. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the maintainer, plosone. General contact details of the provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.