
Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices

Authors

Listed:
  • Hendy Abdoul
  • Christophe Perrey
  • Philippe Amiel
  • Florence Tubach
  • Serge Gottot
  • Isabelle Durand-Zaleski
  • Corinne Alberti

Abstract

Background: Peer review of grant applications has been criticized as lacking reliability. Studies showing poor agreement among reviewers support this criticism, but they have usually focused on reviewers' scores and failed to investigate the reasons for disagreement. Here, our goal was to determine how reviewers rate applications, by investigating reviewer practices and grant assessment criteria.

Methods and Findings: We first collected and analyzed a convenience sample of French and international calls for proposals and assessment guidelines, from which we created an overall typology of assessment criteria comprising nine domains: relevance to the call for proposals, usefulness, originality, innovativeness, methodology, feasibility, funding, ethical aspects, and writing of the grant application. We then performed a qualitative study of reviewer practices, particularly regarding the use of assessment criteria, among reviewers of the French Academic Hospital Research Grant Agencies (Programmes Hospitaliers de Recherche Clinique, PHRCs). Semi-structured interviews and observation sessions were conducted. Both the time spent assessing each grant application and the assessment methods varied across reviewers. All reviewers described the assessment criteria recommended by the PHRCs as frequently evaluated and useful. However, use of the PHRC criteria was subjective and varied across reviewers. Some reviewers gave the same weight to each assessment criterion, whereas others considered originality the most important criterion (12/34), followed by methodology (10/34) and feasibility (4/34). Conceivably, this variability might adversely affect the reliability of the review process, and studies evaluating this hypothesis would be of interest.

Conclusions: Variability across reviewers may foster mistrust of the review process among grant applicants, so ensuring transparency is of the utmost importance. Consistency in the review process could also be improved by providing common definitions for each assessment criterion and uniform requirements for grant application submissions. Further research is needed to assess the feasibility and acceptability of these measures.

Suggested Citation

  • Hendy Abdoul & Christophe Perrey & Philippe Amiel & Florence Tubach & Serge Gottot & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-15, September.
  • Handle: RePEc:plo:pone00:0046054
    DOI: 10.1371/journal.pone.0046054

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0046054
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0046054&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0046054?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription

    References listed on IDEAS

    1. Meike Olbrecht & Lutz Bornmann, 2010. "Panel peer review of grant applications: what do we know from research in social psychology on judgment and decision-making in groups?," Research Evaluation, Oxford University Press, vol. 19(4), pages 293-304, October.
    2. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    3. Michael L Callaham & John Tercier, 2007. "The Relationship of Previous Training and Experience of Journal Peer Reviewers to Subsequent Review Quality," PLOS Medicine, Public Library of Science, vol. 4(1), pages 1-9, January.
    4. Sutherland, H. J. & Meslin, E. M. & da Cunha, R. & Till, J. E., 1993. "Judging clinical research questions: What criteria are used?," Social Science & Medicine, Elsevier, vol. 37(12), pages 1427-1430, December.
    5. Hendy Abdoul & Christophe Perrey & Florence Tubach & Philippe Amiel & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Non-Financial Conflicts of Interest in Academic Grant Evaluation: A Qualitative Study of Multiple Stakeholders in France," PLOS ONE, Public Library of Science, vol. 7(4), pages 1-10, April.
    6. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    7. The PLoS Medicine Editors, 2007. "Peer Review in PLoS Medicine," PLOS Medicine, Public Library of Science, vol. 4(1), pages 1-2, January.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. David Gurwitz & Elena Milanesi & Thomas Koenig, 2014. "Grant Application Review: The Case of Transparency," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-6, December.
    2. Xiaoyu Liu & Xuefeng Wang & Donghua Zhu, 2022. "Reviewer recommendation method for scientific research proposals: a case for NSFC," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3343-3366, June.
    3. Gill, Chelsea & Mehrotra, Vishal & Moses, Olayinka & Bui, Binh, 2024. "The impact of the Pitching Research Framework on AFAANZ grant applications: A pre-registered study," Pacific-Basin Finance Journal, Elsevier, vol. 84(C).
    4. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    5. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    6. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    7. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    8. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    9. Gill, Chelsea & Mehrotra, Vishal & Moses, Olayinka & Bui, Binh, 2023. "The impact of the pitching research framework on AFAANZ grant applications," Pacific-Basin Finance Journal, Elsevier, vol. 77(C).
    10. Rebecca Abma-Schouten & Joey Gijbels & Wendy Reijmerink & Ingeborg Meijer, 2023. "Evaluation of research proposals by peer review panels: broader panels for broader assessments?," Science and Public Policy, Oxford University Press, vol. 50(4), pages 619-632.
    11. Emre Özel, 2024. "What is Gender Bias in Grant Peer review?," Working Papers halshs-03862027, HAL.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hendy Abdoul & Christophe Perrey & Florence Tubach & Philippe Amiel & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Non-Financial Conflicts of Interest in Academic Grant Evaluation: A Qualitative Study of Multiple Stakeholders in France," PLOS ONE, Public Library of Science, vol. 7(4), pages 1-10, April.
    2. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    3. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    4. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    5. José Luis Ortega, 2017. "Are peer-review activities related to reviewer bibliometric performance? A scientometric analysis of Publons," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 947-962, August.
    6. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    7. Annita Nugent & Ho Fai Chan & Uwe Dulleck, 2022. "Government funding of university-industry collaboration: exploring the impact of targeted funding on university patent activity," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 29-73, January.
    8. Marcin Kozak & Lutz Bornmann, 2012. "A New Family of Cumulative Indexes for Measuring Scientific Performance," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-4, October.
    9. Khaled Shawwa & Romy Kallas & Serge Koujanian & Arnav Agarwal & Ignacio Neumann & Paul Alexander & Kari A O Tikkinen & Gordon Guyatt & Elie A Akl, 2016. "Requirements of Clinical Journals for Authors’ Disclosure of Financial and Non-Financial Conflicts of Interest: A Cross Sectional Study," PLOS ONE, Public Library of Science, vol. 11(3), pages 1-12, March.
    10. L. Erik Clavería & Eliseo Guallar & Jordi Camí & José Conde & Roberto Pastor & José R. Ricoy & Eduardo Rodríguez-Farré & Fernando Ruiz-Palomo & Emilio Muñoz, 2000. "Does Peer Review Predict the Performance of Research Projects in Health Sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 47(1), pages 11-23, January.
    11. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.
    12. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    13. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    14. Bornmann, Lutz & Daniel, Hans-Dieter, 2009. "Extent of type I and type II errors in editorial decisions: A case study on Angewandte Chemie International Edition," Journal of Informetrics, Elsevier, vol. 3(4), pages 348-352.
    15. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    16. Eric Libby & Leon Glass, 2010. "The Calculus of Committee Composition," PLOS ONE, Public Library of Science, vol. 5(9), pages 1-8, September.
    17. Yu-Wei Chang & Dar-Zen Chen & Mu-Hsuan Huang, 2021. "Do extraordinary science and technology scientists balance their publishing and patenting activities?," PLOS ONE, Public Library of Science, vol. 16(11), pages 1-20, November.
    18. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    19. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    20. Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0046054. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.