Printed from https://ideas.repec.org/a/pal/palcom/v6y2020i1d10.1057_s41599-020-0412-9.html

Criteria for assessing grant applications: a systematic review

Author

Listed:
  • Sven E. Hug

    (University of Zurich)

  • Mirjam Aeschbach

    (University of Zurich)

Abstract

Criteria are an essential component of any procedure for assessing merit. Yet, little is known about the criteria peers use to assess grant applications. In this systematic review, we therefore identify and synthesize studies that examine grant peer review criteria in an empirical and inductive manner. To facilitate the synthesis, we introduce a framework that classifies what is generally referred to as a ‘criterion’ into an evaluated entity (i.e., the object of evaluation) and an evaluation criterion (i.e., the dimension along which an entity is evaluated). In total, the synthesis includes 12 studies on grant peer review criteria. Two-thirds of these studies examine criteria in the medical and health sciences, while studies in other fields are scarce. Few studies compare criteria across different fields, and none focus on criteria for interdisciplinary research. We conducted a qualitative content analysis of the 12 studies and thereby identified 15 evaluation criteria and 30 evaluated entities, as well as the relations between them. Based on a network analysis, we determined the following main relations between the identified evaluation criteria and evaluated entities. The aims and outcomes of a proposed project are assessed in terms of the evaluation criteria originality, academic relevance, and extra-academic relevance. The proposed research process is evaluated both on the content level (quality, appropriateness, rigor, coherence/justification) and on the level of description (clarity, completeness). The resources needed to implement the research process are evaluated in terms of the evaluation criterion feasibility. Lastly, the person and personality of the applicant are assessed from a ‘psychological’ (motivation, traits) and a ‘sociological’ (diversity) perspective. Furthermore, we find that some of the criteria peers use to evaluate grant applications do not conform to the fairness doctrine and the ideal of impartiality. Grant peer review could therefore be considered unfair and biased. Our findings suggest that future studies on criteria in grant peer review should focus on the applicant, include data from non-Western countries, and examine fields other than the medical and health sciences.
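The framework summarized in the abstract links evaluated entities (objects of evaluation) to evaluation criteria (dimensions of evaluation) and analyzes them as a network. The sketch below illustrates that bipartite structure using the main relations named in the abstract; the code is purely illustrative and is not the authors' actual analysis, and the node labels are shorthand, not the paper's exact terms.

```python
# Illustrative sketch: the abstract's criterion framework as a bipartite
# network linking evaluated entities to evaluation criteria. The relation
# pairs below paraphrase the main relations stated in the abstract.
from collections import defaultdict

# (evaluated entity, evaluation criterion) pairs
relations = [
    ("aims and outcomes", "originality"),
    ("aims and outcomes", "academic relevance"),
    ("aims and outcomes", "extra-academic relevance"),
    ("research process (content)", "quality"),
    ("research process (content)", "appropriateness"),
    ("research process (content)", "rigor"),
    ("research process (content)", "coherence/justification"),
    ("research process (description)", "clarity"),
    ("research process (description)", "completeness"),
    ("resources", "feasibility"),
    ("applicant", "motivation"),
    ("applicant", "traits"),
    ("applicant", "diversity"),
]

# Degree of each node: how many relations it participates in
degree = defaultdict(int)
for entity, criterion in relations:
    degree[entity] += 1
    degree[criterion] += 1

# The entity linked to the most criteria in this sketch
most_connected = max(degree, key=degree.get)
```

In this toy representation, the content level of the research process is the most connected entity, mirroring the abstract's observation that it is judged along several criteria at once (quality, appropriateness, rigor, coherence/justification).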

Suggested Citation

  • Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
  • Handle: RePEc:pal:palcom:v:6:y:2020:i:1:d:10.1057_s41599-020-0412-9
    DOI: 10.1057/s41599-020-0412-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-020-0412-9
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-020-0412-9?utm_source=ideas
    File Function: LibKey link (redirects to full text via your library's subscription if access is restricted and your library uses this service)

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hendy Abdoul & Christophe Perrey & Philippe Amiel & Florence Tubach & Serge Gottot & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-15, September.
    2. Jochen Gläser & Grit Laudel, 2005. "Advantages and dangers of ‘remote’ peer evaluation," Research Evaluation, Oxford University Press, vol. 14(3), pages 186-198, December.
    3. Pleun van Arensbergen & Inge van der Weijden & Peter van den Besselaar, 2014. "Different views on scholarly talent: What are the talents we are looking for in science?," Research Evaluation, Oxford University Press, vol. 23(4), pages 273-284.
    4. Lipworth, Wendy L. & Kerridge, Ian H. & Carter, Stacy M. & Little, Miles, 2011. "Journal peer review in context: A qualitative study of the social and subjective dimensions of manuscript review in biomedical publishing," Social Science & Medicine, Elsevier, vol. 72(7), pages 1056-1063, April.
    5. Nees Jan van Eck & Ludo Waltman, 2010. "Software survey: VOSviewer, a computer program for bibliometric mapping," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(2), pages 523-538, August.
    6. Christopher W. Belter, 2016. "Citation analysis as a literature search method for systematic reviews," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2766-2777, November.
    7. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    8. OECD, 2018. "Effective operation of competitive research funding systems," OECD Science, Technology and Industry Policy Papers 57, OECD Publishing.
    9. Mårtensson, Pär & Fors, Uno & Wallin, Sven-Bertil & Zander, Udo & Nilsson, Gunnar H, 2016. "Evaluating research: A multidisciplinary approach to assessing research practice and quality," Research Policy, Elsevier, vol. 45(3), pages 593-603.
    10. Sven E. Hug & Michael Ochsner & Hans-Dieter Daniel, 2013. "Criteria for assessing research quality in the humanities: a Delphi study among scholars of English literature, German literature and art history," Research Evaluation, Oxford University Press, vol. 22(5), pages 369-383, August.
    11. Peter van den Besselaar & Ulf Sandström & Hélène Schiffbaenker, 2018. "Studying grant decision-making: a linguistic analysis of review reports," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 313-329, October.
    12. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    13. Oortwijn, Wija J. & Vondeling, Hindrik & van Barneveld, Teus & van Vugt, Christel & Bouter, Lex M., 2002. "Priority setting for health technology assessment in The Netherlands: principles and practice," Health Policy, Elsevier, vol. 62(3), pages 227-242, December.
    14. Stevan Harnad, 2009. "Open access scientometrics and the UK Research Assessment Exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(1), pages 147-156, April.
    15. Martin Reinhart, 2010. "Peer review practices: a content analysis of external reviews in science funding," Research Evaluation, Oxford University Press, vol. 19(5), pages 317-331, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Song Jing & Qingzhao Ma & Siyi Wang & Hanliang Xu & Tian Xu & Xia Guo & Zhuolin Wu, 2024. "Research on developmental evaluation based on the "four abilities" model: evidence from early career researchers in China," Quality & Quantity: International Journal of Methodology, Springer, vol. 58(1), pages 681-704, February.
    2. Emre Özel, 2024. "What is Gender Bias in Grant Peer Review?," Working Papers halshs-03862027, HAL.
    3. Lutz Bornmann & Julian N. Marewski, 2024. "Opium in science and society: numbers and other quantifications," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5313-5346, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    2. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    3. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    4. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    5. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    6. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    7. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    8. A. I. M. Jakaria Rahman & Raf Guns & Loet Leydesdorff & Tim C. E. Engels, 2016. "Measuring the match between evaluators and evaluees: cognitive distances between panel members and research groups at the journal level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1639-1663, December.
    9. Thelwall, Mike & Kousha, Kayvan & Stuart, Emma & Makita, Meiko & Abdoli, Mahshid & Wilson, Paul & Levitt, Jonathan, 2023. "Do bibliometrics introduce gender, institutional or interdisciplinary biases into research evaluations?," Research Policy, Elsevier, vol. 52(8).
    10. Lawrence Smolinsky & Daniel S. Sage & Aaron J. Lercher & Aaron Cao, 2021. "Citations versus expert opinions: citation analysis of featured reviews of the American Mathematical Society," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 3853-3870, May.
    11. Rebecca Abma-Schouten & Joey Gijbels & Wendy Reijmerink & Ingeborg Meijer, 2023. "Evaluation of research proposals by peer review panels: broader panels for broader assessments?," Science and Public Policy, Oxford University Press, vol. 50(4), pages 619-632.
    12. Emre Özel, 2024. "What is Gender Bias in Grant Peer Review?," Working Papers halshs-03862027, HAL.
    13. Zhichao Wang & Valentin Zelenyuk, 2021. "Performance Analysis of Hospitals in Australia and its Peers: A Systematic Review," CEPA Working Papers Series WP012021, School of Economics, University of Queensland, Australia.
    14. Jürgen Janger & Nicole Schmidt-Padickakudy & Anna Strauss-Kollin, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664.
    15. Jin Su & Mo Wang & Mohd Adib Mohammad Razi & Norlida Mohd Dom & Noralfishah Sulaiman & Lai-Wai Tan, 2023. "A Bibliometric Review of Nature-Based Solutions on Urban Stormwater Management," Sustainability, MDPI, vol. 15(9), pages 1-23, April.
    16. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    17. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    18. Elena Veretennik & Maria Yudkevich, 2023. "Inconsistent quality signals: evidence from the regional journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3675-3701, June.
    19. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," CRITICAL PERSPECTIVES ON ACCOUNTING, Elsevier, vol. 51(C), pages 24-46.
    20. He-Li Sun & Yuan Feng & Qinge Zhang & Jia-Xin Li & Yue-Ying Wang & Zhaohui Su & Teris Cheung & Todd Jackson & Sha Sha & Yu-Tao Xiang, 2022. "The Microbiome–Gut–Brain Axis and Dementia: A Bibliometric Analysis," IJERPH, MDPI, vol. 19(24), pages 1-14, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:6:y:2020:i:1:d:10.1057_s41599-020-0412-9. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: https://www.nature.com/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.