
A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications

Author

Listed:
  • Bornmann, Lutz
  • Leydesdorff, Loet
  • Van den Besselaar, Peter

Abstract

Combining different data sets with information on grant and fellowship applications submitted to two renowned funding agencies, we are able to compare their funding decisions (award and rejection) with scientometric performance indicators across two fields of science (life sciences and social sciences). The data sets involve 671 applications in social sciences and 668 applications in life sciences. In both fields, awarded applicants perform on average better than all rejected applicants. If only the most preeminent rejected applicants are considered in both fields, they score better than the awardees on citation impact. With regard to productivity, we find differences between the fields: while the awardees in life sciences on average outperform the most preeminent rejected applicants, the situation is reversed in social sciences.
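To make the comparison logic described in the abstract concrete, the following Python sketch computes mean citation impact and productivity for three groups: awardees, all rejected applicants, and the "most preeminent" rejected applicants. This is not the authors' code; the data, column names, and the cut-off used to define the preeminent group are illustrative assumptions only.

    # Minimal sketch, not the authors' analysis: compare awardees with all
    # rejected applicants and with the "most preeminent" rejected applicants
    # on two bibliometric indicators. All values are hypothetical.
    import pandas as pd

    # Hypothetical applicant-level data: one row per application.
    applicants = pd.DataFrame({
        "decision":  ["awarded", "awarded", "awarded",
                      "rejected", "rejected", "rejected", "rejected"],
        "citations": [120, 95, 80, 40, 30, 150, 20],   # citation-impact proxy
        "papers":    [18, 14, 11, 9, 6, 22, 4],        # productivity proxy
    })

    awarded = applicants[applicants["decision"] == "awarded"]
    rejected = applicants[applicants["decision"] == "rejected"]

    # One possible operationalisation of the "most preeminent" rejected group:
    # rejected applicants whose citation impact exceeds the awardees' median.
    top_rejected = rejected[rejected["citations"] > awarded["citations"].median()]

    summary = pd.DataFrame({
        "awarded":      awarded[["citations", "papers"]].mean(),
        "all_rejected": rejected[["citations", "papers"]].mean(),
        "top_rejected": top_rejected[["citations", "papers"]].mean(),
    })
    print(summary.round(1))

In the study itself the comparison is run separately for the two fields, and the reference list below points to regression models for count data rather than simple group means; the sketch only illustrates the grouping logic.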

Suggested Citation

  • Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.
  • Handle: RePEc:eee:infome:v:4:y:2010:i:3:p:211-220
    DOI: 10.1016/j.joi.2009.10.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157709000789
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2009.10.004?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lutz Bornmann & Rüdiger Mutz & Hans‐Dieter Daniel, 2008. "Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(5), pages 830-837, March.
    2. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    3. Moed, H. F. & Burger, W. J. M. & Frankfort, J. G. & Van Raan, A. F. J., 1985. "The use of bibliometric data for the measurement of university research performance," Research Policy, Elsevier, vol. 14(3), pages 131-149, June.
    4. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Is the h index related to (standard) bibliometric measures and to the assessments by peers? An investigation of the h index by using molecular life sciences data," Research Evaluation, Oxford University Press, vol. 17(2), pages 149-156, June.
    5. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    6. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    7. Hausman, Jerry & Hall, Bronwyn H & Griliches, Zvi, 1984. "Econometric Models for Count Data with an Application to the Patents-R&D Relationship," Econometrica, Econometric Society, vol. 52(4), pages 909-938, July.
    8. Göran Melin & Rickard Danell, 2006. "The top eight percent: Development of approved and rejected applicants for a prestigious grant in Sweden," Science and Public Policy, Oxford University Press, vol. 33(10), pages 702-712, December.
    9. Peter van den Besselaar & Loet Leydesdorff, 2009. "Past performance, peer review and project selection: a case study in the social and behavioral sciences," Research Evaluation, Oxford University Press, vol. 18(4), pages 273-288, October.
    10. J. Scott Long & Jeremy Freese, 2006. "Regression Models for Categorical Dependent Variables using Stata, 2nd Edition," Stata Press books, StataCorp LP, edition 2, number long2, March.
    11. Lutz Bornmann & Hans‐Dieter Daniel, 2007. "Multiple publication on a single research study: Does it pay? The influence of number of research articles on total citation counts in biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(8), pages 1100-1107, June.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Frank Rijnsoever & Leon Welle & Sjoerd Bakker, 2014. "Credibility and legitimacy in policy-driven innovation networks: resource dependencies and expectations in Dutch electric vehicle subsidies," The Journal of Technology Transfer, Springer, vol. 39(4), pages 635-661, August.
    2. Daniele Rotolo & Michael Hopkins & Nicola Grassano, 2023. "Do funding sources complement or substitute? Examining the impact of cancer research publications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 50-66, January.
    3. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    4. Sander Gerritsen & Karen van der Wiel & Erik Plug, 2013. "Up or out? How individual research grants affect academic careers in the Netherlands," CPB Discussion Paper 249, CPB Netherlands Bureau for Economic Policy Analysis.
    5. Gerald Schweiger & Adrian Barnett & Peter van den Besselaar & Lutz Bornmann & Andreas De Block & John P. A. Ioannidis & Ulf Sandstrom & Stijn Conix, 2024. "The Costs of Competition in Distributing Scarce Research Funds," Papers 2403.16934, arXiv.org.
    6. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    7. Adriana Bin & Sergio Salles-Filho & Ana Carolina Spatti & Jesús Pascual Mena-Chalco & Fernando Antonio Basile Colugnati, 2022. "How much does a Ph.D. scholarship program impact an emerging economy research performance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6935-6960, December.
    8. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    9. Buehling, Kilian, 2021. "Changing research topic trends as an effect of publication rankings – The case of German economists and the Handelsblatt Ranking," Journal of Informetrics, Elsevier, vol. 15(3).
    10. Jun-Ying Fu & Xu Zhang & Yun-Hua Zhao & He-Feng Tong & Dar-Zen Chen & Mu-Hsuan Huang, 2012. "Scientific production and citation impact: a bibliometric analysis in acupuncture over three decades," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 1061-1079, December.
    11. Krist Vaesen & Joel Katzav, 2017. "How much would each researcher receive if competitive government research funding were distributed equally among researchers?," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
    12. Pleun Arensbergen & Inge van der Weijden & Peter Besselaar, 2012. "Gender differences in scientific productivity: a persisting phenomenon?," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 857-868, December.
    13. Loet Leydesdorff, 2013. "An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 35-55, January.
    14. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    15. Adriana Bin & Sergio Salles-Filho & Luiza Maria Capanema & Fernando Antonio Basile Colugnati, 2015. "What difference does it make? Impact of peer-reviewed scholarships on scientific production," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1167-1188, February.
    16. Tobias Opthof & Loet Leydesdorff, 2011. "A comment to the paper by Waltman et al., Scientometrics, 87, 467–481, 2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 1011-1016, September.
    17. Peter van den Besselaar & Ulf Sandström & Hélène Schiffbaenker, 2018. "Studying grant decision-making: a linguistic analysis of review reports," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 313-329, October.
    18. Dangzhi Zhao, 2010. "Characteristics and impact of grant-funded research: a case study of the library and information science field," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(2), pages 293-306, August.
    19. van den Besselaar, Peter, 2012. "Selection committee membership: Service or self-service," Journal of Informetrics, Elsevier, vol. 6(4), pages 580-585.
    20. Andrea Bonaccorsi & Luca Secondi, 2017. "The determinants of research performance in European universities: a large scale multilevel analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1147-1178, September.
    21. Squazzoni, Flaminio & Gandelli, Claudio, 2012. "Saint Matthew strikes again: An agent-based model of peer review and the scientific community structure," Journal of Informetrics, Elsevier, vol. 6(2), pages 265-275.
    22. Maaike Verbree & Edwin Horlings & Peter Groenewegen & Inge Weijden & Peter Besselaar, 2015. "Organizational factors influencing scholarly performance: a multivariate study of biomedical research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 25-49, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    2. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    3. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    4. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182, January.
    5. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    6. Vîiu, Gabriel-Alexandru, 2016. "A theoretical evaluation of Hirsch-type bibliometric indicators confronted with extreme self-citation," Journal of Informetrics, Elsevier, vol. 10(2), pages 552-566.
    7. Marian-Gabriel Hâncean & Matjaž Perc & Jürgen Lerner, 2021. "The coauthorship networks of the most productive European researchers," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 201-224, January.
    8. Niklas Elert, 2014. "What determines entry? Evidence from Sweden," The Annals of Regional Science, Springer;Western Regional Science Association, vol. 53(1), pages 55-92, August.
    9. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    10. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    11. van Raan, A. F. J. & van Leeuwen, Th. N., 2002. "Assessment of the scientific basis of interdisciplinary, applied research: Application of bibliometric methods in Nutrition and Food Research," Research Policy, Elsevier, vol. 31(4), pages 611-632, May.
    12. Mahmoud Ibrahim Fallatah, 2021. "Innovating in the Desert: a Network Perspective on Knowledge Creation in Developing Countries," Journal of the Knowledge Economy, Springer;Portland International Center for Management of Engineering and Technology (PICMET), vol. 12(3), pages 1533-1551, September.
    13. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.
    14. Sabrina Petersohn & Thomas Heinze, 2018. "Professionalization of bibliometric research assessment. Insights from the history of the Leiden Centre for Science and Technology Studies (CWTS)," Science and Public Policy, Oxford University Press, vol. 45(4), pages 565-578.
    15. Brady Lund, 2019. "Examination of correlates of H-index as a measure of research productivity for library and information science faculty in the United States and Canada," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 897-915, August.
    16. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.
    17. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    18. Bornmann, Lutz & Daniel, Hans-Dieter, 2009. "Extent of type I and type II errors in editorial decisions: A case study on Angewandte Chemie International Edition," Journal of Informetrics, Elsevier, vol. 3(4), pages 348-352.
    19. Faria, Pedro & Sofka, Wolfgang, 2008. "Formal and Strategic Appropriability Strategies of Multinational Firms: A Cross Country Comparison," ZEW Discussion Papers 08-030, ZEW - Leibniz Centre for European Economic Research.
    20. Maaike Verbree & Edwin Horlings & Peter Groenewegen & Inge Weijden & Peter Besselaar, 2015. "Organizational factors influencing scholarly performance: a multivariate study of biomedical research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 25-49, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:4:y:2010:i:3:p:211-220. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.