
Examining the Predictive Validity of NIH Peer Review Scores

Author

Listed:
  • Mark D Lindner
  • Richard K Nakamura

Abstract

The predictive validity of peer review at the National Institutes of Health (NIH) has not yet been demonstrated empirically. It might be assumed that the most efficient and expedient test of the predictive validity of NIH peer review would be to examine the correlation between percentile scores from peer review and bibliometric indices of the publications produced by funded projects. The present study used a large dataset to examine the rationale for such a study and to determine whether it would satisfy the requirements for a test of predictive validity. The results show significant restriction of range in the applications selected for funding. Furthermore, the few applications funded with slightly worse peer review scores are neither selected at random nor representative of other applications in the same range. The funding institutes also negotiate with applicants to address issues identified during peer review, so the peer review scores assigned to the submitted applications, especially for the few funded applications with slightly worse scores, do not reflect the revised and improved projects that are eventually funded. In addition, citation metrics by themselves are not valid or appropriate measures of scientific impact. Using bibliometric indices on their own to measure scientific impact would likely compound the inefficiencies and replicability problems already attributed in large part to the current over-emphasis on such indices. Therefore, retrospective analyses of the correlation between percentile scores from peer review and bibliometric indices of the publications resulting from funded grant applications are not valid tests of the predictive validity of peer review at the NIH.
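
The restriction-of-range argument is statistical: when only the best-scoring applications are funded, the observed correlation between scores and any downstream outcome is attenuated even if the scores are genuinely informative. The following minimal simulation sketch is not from the article; it uses entirely hypothetical variable names and parameter values, with a shared latent "merit" variable driving both a peer review score and a citation-like outcome, simply to illustrate the attenuation effect.

    # Illustrative sketch (not from the paper): restriction of range attenuates
    # an observed score-outcome correlation. All names and values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Latent "merit" influences both the peer review score and later citations.
    merit = rng.normal(size=n)
    percentile_score = merit + rng.normal(scale=1.0, size=n)  # noisy review score
    citation_metric = merit + rng.normal(scale=1.0, size=n)   # noisy bibliometric outcome

    # Correlation over all applications (full range of scores).
    r_full = np.corrcoef(percentile_score, citation_metric)[0, 1]

    # Correlation restricted to the "funded" tail (top ~15% of scores),
    # mimicking the restriction of range among funded applications.
    cutoff = np.quantile(percentile_score, 0.85)
    funded = percentile_score >= cutoff
    r_restricted = np.corrcoef(percentile_score[funded], citation_metric[funded])[0, 1]

    print(f"correlation over all applications: {r_full:.2f}")
    print(f"correlation among 'funded' applications only: {r_restricted:.2f}")

Under these assumed parameters the full-range correlation is about 0.5 by construction, while the correlation within the top-scoring tail is expected to be substantially smaller, which is the attenuation that makes a naive retrospective correlation an unreliable test.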

Suggested Citation

  • Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
  • Handle: RePEc:plo:pone00:0126938
    DOI: 10.1371/journal.pone.0126938

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0126938
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0126938&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0126938?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Alison Abbott & David Cyranoski & Nicola Jones & Brendan Maher & Quirin Schiermeier & Richard Van Noorden, 2010. "Metrics: Do metrics matter?," Nature, Nature, vol. 465(7300), pages 860-862, June.
    3. Terrence A. Brooks, 1985. "Private acts and public objects: An investigation of citer motivations," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 36(4), pages 223-229, July.
    4. McMillan, G. Steven & Narin, Francis & Deeds, David L., 2000. "An analysis of the critical role of public science in innovation: the case of biotechnology," Research Policy, Elsevier, vol. 29(1), pages 1-8, January.
    5. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    6. Hendrik P. van Dalen & Kène Henkens, 2012. "Intended and unintended consequences of a publish‐or‐perish culture: A worldwide survey," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(7), pages 1282-1293, July.
    7. Daniele Fanelli, 2010. "Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-7, April.
    8. Katz, David A, 1973. "Faculty Salaries, Promotion, and Productivity at a Large University," American Economic Review, American Economic Association, vol. 63(3), pages 469-477, June.
    9. Stuart Macdonald & Jacqueline Kam, 2007. "Ring a Ring o’ Roses: Quality Journals and Gamesmanship in Management Studies," Journal of Management Studies, Wiley Blackwell, vol. 44(4), pages 640-655, June.
    10. James S. Fairweather, 2005. "Beyond the Rhetoric: Trends in the Relative Value of Teaching and Research in Faculty Salaries," The Journal of Higher Education, Taylor & Francis Journals, vol. 76(4), pages 401-422, July.
    11. Peter A. Lawrence, 2003. "The politics of publication," Nature, Nature, vol. 422(6929), pages 259-261, March.
    12. Ted I. K. Youn & Tanya M. Price, 2009. "Learning from the Experience of Others: The Evolution of Faculty Tenure and Promotion Rules in Comprehensive Institutions," The Journal of Higher Education, Taylor & Francis Journals, vol. 80(2), pages 204-237, March.
    13. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    14. Neal S Young, 2008. "Why Current Publication May Distort Science," Working Papers id:1757, eSocialSciences.
    15. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    16. Moore, William J & Newman, Robert J & Turnbull, Geoffrey K, 1998. "Do Academic Salaries Decline with Seniority?," Journal of Labor Economics, University of Chicago Press, vol. 16(2), pages 352-366, April.
    17. Michael Graber & Andrey Launov & Klaus Wälde, 2008. "Publish or Perish? The Increasing Importance of Publications for Prospective Economics Professors in Austria, Germany and Switzerland," German Economic Review, Verein für Socialpolitik, vol. 9(4), pages 457-472, November.
    18. Moore, William J & Newman, Robert J & Turnbull, Geoffrey K, 2001. "Reputational Capital and Academic Pay," Economic Inquiry, Western Economic Association International, vol. 39(4), pages 663-671, October.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    2. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    3. Gemma Elizabeth Derrick & Alessandra Zimmermann & Helen Greaves & Jonathan Best & Richard Klavans, 2024. "Targeted, actionable and fair: Reviewer reports as feedback and its effect on ECR career choices," Research Evaluation, Oxford University Press, vol. 32(4), pages 648-657.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Joeri K Tijdink & Anton C M Vergouwen & Yvo M Smulders, 2013. "Publication Pressure and Burn Out among Dutch Medical Professors: A Nationwide Survey," PLOS ONE, Public Library of Science, vol. 8(9), pages 1-6, September.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    4. Daniele Fanelli & Rodrigo Costas & Vincent Larivière, 2015. "Misconduct Policies, Academic Culture and Career Stage, Not Gender or Pressures to Publish, Affect Scientific Integrity," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-18, June.
    5. Ilya Prakhov & Victor Rudakov, 2018. "The Determinants of Faculty Pay in Russian Universities: Incentive Contracts," HSE Working papers WP BRP 47/EDU/2018, National Research University Higher School of Economics.
    6. Mangirdas Morkunas & Elzė Rudienė & Lukas Giriūnas & Laura Daučiūnienė, 2020. "Assessment of Factors Causing Bias in Marketing- Related Publications," Publications, MDPI, vol. 8(4), pages 1-16, October.
    7. Michael J. Hilmer & Michael R. Ransom & Christiana E. Hilmer, 2015. "Fame and the fortune of academic economists: How the market rewards influential research in economics," Southern Economic Journal, John Wiley & Sons, vol. 82(2), pages 430-452, October.
    8. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    9. Daniele Fanelli & Vincent Larivière, 2016. "Researchers’ Individual Publication Rate Has Not Increased in a Century," PLOS ONE, Public Library of Science, vol. 11(3), pages 1-12, March.
    10. Robbie C M van Aert & Marcel A L M van Assen, 2017. "Bayesian evaluation of effect size after replicating an original study," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-23, April.
    11. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    12. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    13. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    14. Dell'Anno, Roberto & Caferra, Rocco & Morone, Andrea, 2020. "A “Trojan Horse” in the peer-review process of fee-charging economic journals," Journal of Informetrics, Elsevier, vol. 14(3).
    15. Kirmayer, Laurence J., 2012. "Cultural competence and evidence-based practice in mental health: Epistemic communities and the politics of pluralism," Social Science & Medicine, Elsevier, vol. 75(2), pages 249-256.
    16. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    17. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    18. John Gibson & David L. Anderson & John Tressler, 2017. "Citations Or Journal Quality: Which Is Rewarded More In The Academic Labor Market?," Economic Inquiry, Western Economic Association International, vol. 55(4), pages 1945-1965, October.
    19. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    20. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0126938. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.