Printed from https://ideas.repec.org/a/oup/rseval/v32y2024i4p623-634..html

Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany

Authors

  • Eva Barlösius
  • Laura Paruschke
  • Axel Philipps

Abstract

Peer review has developed over time to become the established procedure for assessing and assuring the scientific quality of research. Nevertheless, the procedure has also been variously criticized as conservative, biased, and unfair, among other things. Do scientists regard all these flaws as equally problematic? Do they have the same opinions on which problems are so serious that other selection procedures ought to be considered? The answers to these questions hint at what should be modified in peer review processes as a priority objective. The authors of this paper use survey data to examine how members of the scientific community weigh different shortcomings of peer review processes. Which of those processes’ problems do they consider less relevant? Which problems, on the other hand, do they judge to be beyond remedy? Our investigation shows that certain defects of peer review processes are indeed deemed irreparable: (1) legitimate quandaries in the process of fine-tuning the choice between equally eligible research proposals and in the selection of daring ideas; and (2) illegitimate problems due to networks. Science-policy measures to improve peer review processes should therefore distinguish more clearly than is currently the case between field-specific remediable and irremediable flaws.

Suggested Citation

  • Eva Barlösius & Laura Paruschke & Axel Philipps, 2024. "Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany," Research Evaluation, Oxford University Press, vol. 32(4), pages 623-634.
  • Handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:623-634.
    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvad032
    Download Restriction: Access to full text is restricted to subscribers.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Axel Philipps, 2021. "Science rules! A qualitative study of scientists’ approaches to grant lottery," Research Evaluation, Oxford University Press, vol. 30(1), pages 102-111.
    2. Serge P J M Horbach & Joeri K Tijdink & Lex M Bouter, 2022. "Partial lottery can make grant allocation more fair, more efficient, and more diverse," Science and Public Policy, Oxford University Press, vol. 49(4), pages 580-582.
    3. Thomas Heinze, 2008. "How to sponsor ground-breaking research: A comparison of funding schemes," Science and Public Policy, Oxford University Press, vol. 35(5), pages 302-318, June.
    4. Peter van den Besselaar & Ulf Sandström & Hélène Schiffbaenker, 2018. "Studying grant decision-making: a linguistic analysis of review reports," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 313-329, October.
    5. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    6. Baptiste Bedessem, 2020. "Should we fund research randomly? An epistemological criticism of the lottery model as an alternative to peer review for the funding of science," Research Evaluation, Oxford University Press, vol. 29(2), pages 150-157.
    7. Laudel, Grit & Gläser, Jochen, 2014. "Beyond breakthrough research: Epistemic properties of research and their consequences for research funding," Research Policy, Elsevier, vol. 43(7), pages 1204-1216.
    8. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eva Barlösius & Kristina Blem, 2021. "Evidence of research mastery: How applicants argue the feasibility of their research projects," Research Evaluation, Oxford University Press, vol. 30(4), pages 563-571.
    2. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    3. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    4. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    5. Leila Jabrane, 2022. "Individual excellence funding: effects on research autonomy and the creation of protected spaces," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-9, December.
    6. Jacqueline N. Lane & Misha Teplitskiy & Gary Gray & Hardeep Ranu & Michael Menietti & Eva C. Guinan & Karim R. Lakhani, 2022. "Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation," Management Science, INFORMS, vol. 68(6), pages 4478-4495, June.
    7. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    8. Karolin Sjöö & Wolfgang Kaltenbrunner, 2023. "Gender mainstreaming research funding: a study of effects on STEM research proposals," Science and Public Policy, Oxford University Press, vol. 50(2), pages 304-317.
    9. Hren, Darko & Pina, David G. & Norman, Christopher R. & Marušić, Ana, 2022. "What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports," Journal of Informetrics, Elsevier, vol. 16(2).
    10. Julian Kolev & Yuly Fuentes-Medel & Fiona Murray, 2019. "Is Blinded Review Enough? How Gendered Outcomes Arise Even Under Anonymous Evaluation," NBER Working Papers 25759, National Bureau of Economic Research, Inc.
    11. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, July.
    12. Jeffrey L. Furman & Florenta Teodoridis, 2020. "Automation, Research Technology, and Researchers’ Trajectories: Evidence from Computer Science and Electrical Engineering," Organization Science, INFORMS, vol. 31(2), pages 330-354, March.
    13. Ke, Qing, 2020. "Technological impact of biomedical research: The role of basicness and novelty," Research Policy, Elsevier, vol. 49(7).
    14. Aurélie Hemonnet-Goujot & Delphine Manceau & Celine Abecassis-Moedas, 2019. "Drivers and Pathways of NPD Success in the Marketing-External Design Relationship," Post-Print hal-01883760, HAL.
    15. Nicolas Carayol & Emeric Henry & Marianne Lanoë, 2020. "Stimulating Peer Effects? Evidence from a Research Cluster Policy," Working Papers hal-03874261, HAL.
    16. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    17. Jacqueline N. Lane & Ina Ganguli & Patrick Gaule & Eva Guinan & Karim R. Lakhani, 2021. "Engineering serendipity: When does knowledge sharing lead to knowledge production?," Strategic Management Journal, Wiley Blackwell, vol. 42(6), pages 1215-1244, June.
    18. Andrew Shipilov & Frédéric C. Godart & Julien Clement, 2017. "Which boundaries? How mobility networks across countries and status groups affect the creative performance of organizations," Strategic Management Journal, Wiley Blackwell, vol. 38(6), pages 1232-1252, June.
    19. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    20. Julia Heuritsch, 2023. "The Evaluation Gap in Astronomy—Explained through a Rational Choice Framework," Publications, MDPI, vol. 11(2), pages 1-26, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:623-634.. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Oxford University Press (email available below). General contact details of provider: https://academic.oup.com/rev .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.