
Targeted, actionable and fair: Reviewer reports as feedback and its effect on ECR career choices

Author

Listed:
  • Gemma Elizabeth Derrick
  • Alessandra Zimmermann
  • Helen Greaves
  • Jonathan Best
  • Richard Klavans

Abstract

Previous studies of the use of peer review for the allocation of competitive funding have concentrated on questions of efficiency and how to make the ‘best’ decision by ensuring that successful applicants are also the most productive or visible in the long term. This paper examines how the components of feedback received from an unsuccessful grant application are associated with motivating applicants’ career decisions to persist (reapply for funding at T1) or to switch (not reapply, or leave academia). This study combined data from interviews with unsuccessful ECR applicants (n = 19) to The Wellcome Trust, 2009–19, with manual coding of the reviewer comments received by applicants (n = 81). All applicants received feedback on their application at T0, and a large proportion of unsuccessful applicants reapplied for funding at T1. Here, peer-review-comments-as-feedback send signals that encourage applicants to persist (continue) or switch (not continue) even when the initial application has failed. Feedback that unsuccessful applicants identified as motivating their decision to resubmit had three characteristics: it was actionable, targeted, and fair. The results lead to the identification of feedback standards for funding agencies and peer reviewers to promote when providing reviewer feedback to applicants as part of the peer review process. The provision of quality reviewer-reports-as-feedback to applicants ensures that peer review acts as a participatory research governance tool focused on supporting the development of individuals and their future research plans.

Suggested Citation

  • Gemma Elizabeth Derrick & Alessandra Zimmermann & Helen Greaves & Jonathan Best & Richard Klavans, 2024. "Targeted, actionable and fair: Reviewer reports as feedback and its effect on ECR career choices," Research Evaluation, Oxford University Press, vol. 32(4), pages 648-657.
  • Handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:648-657.
    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvad034
    Download Restriction: Access to full text is restricted to subscribers.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Győrffy, Balázs & Herman, Péter & Szabó, István, 2020. "Research funding: past performance is a stronger predictor of future scientific output than reviewer scores," Journal of Informetrics, Elsevier, vol. 14(3).
    2. Yang Wang & Benjamin F. Jones & Dashun Wang, 2019. "Early-career setback and future career impact," Nature Communications, Nature, vol. 10(1), pages 1-10, December.
    3. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    4. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    5. Alexander Oettl, 2012. "Reconceptualizing Stars: Scientist Helpfulness and Peer Performance," Management Science, INFORMS, vol. 58(6), pages 1122-1140, June.
    6. Ted von Hippel & Courtney von Hippel, 2015. "To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-8, March.
    7. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    8. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    9. Dennis L Murray & Douglas Morris & Claude Lavoie & Peter R Leavitt & Hugh MacIsaac & Michael E J Masson & Marc-Andre Villard, 2016. "Bias in Research Grant Evaluation Has Dire Consequences for Small Universities," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-19, June.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    2. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    3. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    4. Dennis L Murray & Douglas Morris & Claude Lavoie & Peter R Leavitt & Hugh MacIsaac & Michael E J Masson & Marc-Andre Villard, 2016. "Bias in Research Grant Evaluation Has Dire Consequences for Small Universities," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-19, June.
    5. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    6. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    7. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664.
    8. Marco Cozzi, 2020. "Public Funding of Research and Grant Proposals in the Social Sciences: Empirical Evidence from Canada," Department Discussion Papers 1809, Department of Economics, University of Victoria.
    9. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    10. Li, Heyang & Wu, Meijun & Wang, Yougui & Zeng, An, 2022. "Bibliographic coupling networks reveal the advantage of diversification in scientific projects," Journal of Informetrics, Elsevier, vol. 16(3).
    11. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    12. Elena Veretennik & Maria Yudkevich, 2023. "Inconsistent quality signals: evidence from the regional journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3675-3701, June.
    13. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," CRITICAL PERSPECTIVES ON ACCOUNTING, Elsevier, vol. 51(C), pages 24-46.
    14. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    15. David Card & Stefano DellaVigna, 2017. "What do Editors Maximize? Evidence from Four Leading Economics Journals," NBER Working Papers 23282, National Bureau of Economic Research, Inc.
    16. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2016. "Why the referees’ reports I receive as an editor are so much better than the reports I receive as an author?," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 967-986, March.
    17. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    18. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    19. Oleksiyenko, Anatoly V., 2023. "Geopolitical agendas and internationalization of post-soviet higher education: Discursive dilemmas in the realm of the prestige economy," International Journal of Educational Development, Elsevier, vol. 102(C).
    20. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2018. "Editorial decisions with informed and uninformed reviewers," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 25-43, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:rseval:v:32:y:2024:i:4:p:648-657.. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Oxford University Press. General contact details of provider: https://academic.oup.com/rev .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.