IDEAS home Printed from https://ideas.repec.org/a/oup/rseval/v32y2024i4p623-634..html

Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany

Author

Listed:
  • Eva Barlösius
  • Laura Paruschke
  • Axel Philipps

Abstract

Peer review has developed over time into the established procedure for assessing and assuring the scientific quality of research. Nevertheless, the procedure has also been variously criticized as conservative, biased, and unfair, among other things. Do scientists regard all these flaws as equally problematic? Do they have the same opinions on which problems are so serious that other selection procedures ought to be considered? The answers to these questions hint at what should be modified in peer review processes as a priority. The authors of this paper use survey data to examine how members of the scientific community weight different shortcomings of peer review processes. Which of those processes’ problems do they consider less relevant? Which problems, on the other hand, do they judge to be beyond remedy? Our investigation shows that certain defects of peer review processes are indeed deemed irreparable: (1) legitimate quandaries in fine-tuning the choice between equally eligible research proposals and in selecting daring ideas; and (2) illegitimate problems caused by networks. Science-policy measures to improve peer review processes should therefore distinguish more clearly between field-specific remediable and irremediable flaws than is currently the case.

Suggested Citation

  • Eva Barlösius & Laura Paruschke & Axel Philipps, 2024. "Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany," Research Evaluation, Oxford University Press, vol. 32(4), pages 623-634.
  • Handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:623-634.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvad032
    Download Restriction: Access to full text is restricted to subscribers.
    ---><---

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Axel Philipps, 2021. "Science rules! A qualitative study of scientists’ approaches to grant lottery [The Secret to Germany’s Scientific Excellence]," Research Evaluation, Oxford University Press, vol. 30(1), pages 102-111.
    2. Laudel, Grit & Gläser, Jochen, 2014. "Beyond breakthrough research: Epistemic properties of research and their consequences for research funding," Research Policy, Elsevier, vol. 43(7), pages 1204-1216.
    3. Serge P J M Horbach & Joeri K Tijdink & Lex M Bouter, 2022. "Partial lottery can make grant allocation more fair, more efficient, and more diverse [Mavericks and lotteries]," Science and Public Policy, Oxford University Press, vol. 49(4), pages 580-582.
    4. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    5. Thomas Heinze, 2008. "How to sponsor ground-breaking research: A comparison of funding schemes," Science and Public Policy, Oxford University Press, vol. 35(5), pages 302-318, June.
    6. Peter van den Besselaar & Ulf Sandström & Hélène Schiffbaenker, 2018. "Studying grant decision-making: a linguistic analysis of review reports," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 313-329, October.
    7. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    8. Baptiste Bedessem, 2020. "Should we fund research randomly? An epistemological criticism of the lottery model as an alternative to peer review for the funding of science," Research Evaluation, Oxford University Press, vol. 29(2), pages 150-157.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eva Barlösius & Kristina Blem, 2021. "Evidence of research mastery: How applicants argue the feasibility of their research projects [Concepts of originality in the natural science, medical, and engineering disciplines: An analysis of r," Research Evaluation, Oxford University Press, vol. 30(4), pages 563-571.
    2. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    3. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    4. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    5. Karolin Sjöö & Wolfgang Kaltenbrunner, 2023. "Gender mainstreaming research funding: a study of effects on STEM research proposals," Science and Public Policy, Oxford University Press, vol. 50(2), pages 304-317.
    6. Leila Jabrane, 2022. "Individual excellence funding: effects on research autonomy and the creation of protected spaces," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-9, December.
    7. Jacqueline N. Lane & Misha Teplitskiy & Gary Gray & Hardeep Ranu & Michael Menietti & Eva C. Guinan & Karim R. Lakhani, 2022. "Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation," Management Science, INFORMS, vol. 68(6), pages 4478-4495, June.
    8. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    9. Hren, Darko & Pina, David G. & Norman, Christopher R. & Marušić, Ana, 2022. "What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports," Journal of Informetrics, Elsevier, vol. 16(2).
    10. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    11. Julian Kolev & Yuly Fuentes-Medel & Fiona Murray, 2019. "Is Blinded Review Enough? How Gendered Outcomes Arise Even Under Anonymous Evaluation," NBER Working Papers 25759, National Bureau of Economic Research, Inc.
    12. Jürgen Janger & Nicole Schmidt-Padickakudy & Anna Strauss-Kollin, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, April.
    13. Jeffrey L. Furman & Florenta Teodoridis, 2020. "Automation, Research Technology, and Researchers’ Trajectories: Evidence from Computer Science and Electrical Engineering," Organization Science, INFORMS, vol. 31(2), pages 330-354, March.
    14. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    15. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    16. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    17. Todd Pezzuti & James M. Leonhardt, 2023. "What’s not to like? Negations in brand messages increase consumer engagement," Journal of the Academy of Marketing Science, Springer, vol. 51(3), pages 675-694, May.
    18. Ke, Qing, 2020. "Technological impact of biomedical research: The role of basicness and novelty," Research Policy, Elsevier, vol. 49(7).
    19. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    20. Aurélie Hemonnet-Goujot & Delphine Manceau & Celine Abecassis-Moedas, 2019. "Drivers and Pathways of NPD Success in the Marketing-External Design Relationship," Post-Print hal-01883760, HAL.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:623-634.. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Oxford University Press (email available below). General contact details of provider: https://academic.oup.com/rev .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.