
Risk evaluation in peer review of grant applications

Author

Listed:
  • Stephen Gallo

    (American Institute of Biological Sciences, Scientific Peer Advisory and Review Services)

  • Lisa Thompson

    (American Institute of Biological Sciences, Scientific Peer Advisory and Review Services)

  • Karen Schmaling

    (Washington State University)

  • Scott Glisson

    (American Institute of Biological Sciences, Scientific Peer Advisory and Review Services)

Abstract

The process of peer review is used to identify the most scientifically meritorious research projects for funding, with impact and innovation among the criteria used to determine overall merit. A long-standing criticism of peer review is the perception that reviewers are biased against innovation; indeed, one study found that reviewers systematically assigned poorer scores to highly novel work. Moreover, reviewers define excellent research and paradigm-shifting research differently, so innovative research may not always be considered excellent. More therefore needs to be done to understand how reviewers weigh risk and innovation in their evaluations. To address this gap, the American Institute of Biological Sciences developed a comprehensive peer review survey that examined, in part, differences between applicant and reviewer perceptions of review outcomes. The survey was disseminated to 13,091 reviewers and applicants, of whom 9.4% responded. Only 24% of responding applicants indicated that innovation was addressed in their review feedback, whereas 81% of responding reviewers indicated that they factored innovation into selecting the best science and 73% viewed innovation as an essential component of scientific excellence. Similarly, while only 27% of responding applicants reported receiving comments on the riskiness of their grant applications, 58% of responding reviewers indicated that the risks associated with innovative research affected the scores they assigned. These results point to a potential source of bias in how innovation and risk are evaluated in grant applications.
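As a rough, back-of-the-envelope check (not stated explicitly in the article, and assuming the 9.4% response rate applies to all 13,091 invitees), the reported figures imply approximately:

    13,091 invited × 0.094 response rate ≈ 1,231 survey respondents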

Suggested Citation

  • Stephen Gallo & Lisa Thompson & Karen Schmaling & Scott Glisson, 2018. "Risk evaluation in peer review of grant applications," Environment Systems and Decisions, Springer, vol. 38(2), pages 216-229, June.
  • Handle: RePEc:spr:envsyd:v:38:y:2018:i:2:d:10.1007_s10669-018-9677-6
    DOI: 10.1007/s10669-018-9677-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10669-018-9677-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10669-018-9677-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Terttu Luukkonen, 2012. "Conservatism and risk-taking in peer review: Emerging ERC practices," Research Evaluation, Oxford University Press, vol. 21(1), pages 48-60, February.
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. John P. A. Ioannidis, 2011. "Fund people not projects," Nature, Nature, vol. 477(7366), pages 529-531, September.
    4. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    5. Paul Slovic, 1999. "Trust, Emotion, Sex, Politics, and Science: Surveying the Risk‐Assessment Battlefield," Risk Analysis, John Wiley & Sons, vol. 19(4), pages 689-701, August.
    6. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    2. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    3. Albert Banal-Estañol & Ines Macho-Stadler & David Pérez-Castrillo, 2016. "Key Success Drivers in Public Research Grants: Funding the Seeds of Radical Innovation in Academia?," CESifo Working Paper Series 5852, CESifo.
    4. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    5. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    6. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    7. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    8. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    9. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    10. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    11. Jacqueline N. Lane & Misha Teplitskiy & Gary Gray & Hardeep Ranu & Michael Menietti & Eva C. Guinan & Karim R. Lakhani, 2022. "Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation," Management Science, INFORMS, vol. 68(6), pages 4478-4495, June.
    12. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    13. Miguel Navascués & Costantino Budroni, 2019. "Theoretical research without projects," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-35, March.
    14. Ellgen, Clifford & Kang, Dominique, 2021. "Research equity: Incentivizing high-risk basic research with market mechanisms," SocArXiv cvngq, Center for Open Science.
    15. Gerald Schweiger & Adrian Barnett & Peter van den Besselaar & Lutz Bornmann & Andreas De Block & John P. A. Ioannidis & Ulf Sandstrom & Stijn Conix, 2024. "The Costs of Competition in Distributing Scarce Research Funds," Papers 2403.16934, arXiv.org.
    16. Joshua Krieger & Ramana Nanda & Ian Hunt & Aimee Reynolds & Peter Tarsa, 2022. "Scoring and Funding Breakthrough Ideas: Evidence from a Global Pharmaceutical Company," Harvard Business School Working Papers 23-014, Harvard Business School, revised Nov 2023.
    17. Roxanne E. Lewis & Michael G. Tyshenko, 2009. "The Impact of Social Amplification and Attenuation of Risk and the Public Reaction to Mad Cow Disease in Canada," Risk Analysis, John Wiley & Sons, vol. 29(5), pages 714-728, May.
    18. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    19. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    20. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:envsyd:v:38:y:2018:i:2:d:10.1007_s10669-018-9677-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.