
Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?

Author

Listed:
  • Bornmann, Lutz
  • Marx, Werner

Abstract

Evaluative bibliometrics compares the citation impact of researchers, research groups and institutions with each other across time periods and disciplines. Both factors, discipline and period, have an influence on the citation count that is independent of the quality of the publication. Normalizing the citation impact of papers for these two factors started in the mid-1980s. Since then, a range of different methods has been presented for producing normalized citation impact scores. The current study uses a data set of over 50,000 records to test which of the methods presented so far correlates best with the assessment of papers by peers. The peer assessments come from F1000Prime, a post-publication peer review system for the biomedical literature. Among the normalized indicators, the study considers not only cited-side indicators, such as the mean normalized citation score (MNCS), but also citing-side indicators. As the results show, the correlations of the indicators with the peer assessments all turn out to be very similar. Since F1000Prime focuses on biomedicine, the results of this study should be validated by other studies based on datasets from other disciplines or (ideally) on multidisciplinary datasets.
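
To make the cited-side normalization concrete, here is a minimal sketch of how a mean normalized citation score (MNCS) can be computed: each paper's citation count is divided by the average citation count of the papers in its reference set (same field and publication year), and the resulting ratios are averaged. This is an illustration under stated assumptions, not the authors' code or data; the field labels, citation counts, and peer scores below are hypothetical.

```python
# Minimal MNCS sketch (illustrative only, not the authors' code).
# Each tuple: (field, publication year, citation count, hypothetical peer score).
from collections import defaultdict
from statistics import mean

papers = [
    ("cell biology", 2010, 45, 3),
    ("cell biology", 2010, 12, 1),
    ("immunology",   2011,  8, 2),
    ("immunology",   2011, 30, 3),
]

# Expected citations: mean citation count of all papers in the same
# field and publication year (the reference set).
totals = defaultdict(float)
counts = defaultdict(int)
for field, year, cites, _ in papers:
    totals[(field, year)] += cites
    counts[(field, year)] += 1
expected = {key: totals[key] / counts[key] for key in totals}

# Normalized citation score per paper: observed citations / expected citations.
ncs = [cites / expected[(field, year)] for field, year, cites, _ in papers]

# The MNCS of a publication set is the arithmetic mean of these ratios;
# the study correlates paper-level scores of this kind with F1000Prime ratings.
print("MNCS:", mean(ncs))
```

Citing-side indicators, by contrast, normalize each incoming citation by properties of the citing paper (for example, the length of its reference list, as in fractional citation counting or the audience factor), rather than by a reference set of cited papers.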

Suggested Citation

  • Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
  • Handle: RePEc:eee:infome:v:9:y:2015:i:2:p:408-418
    DOI: 10.1016/j.joi.2015.01.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157715000073
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2015.01.006?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Loet Leydesdorff & Filippo Radicchi & Lutz Bornmann & Claudio Castellano & Wouter de Nooy, 2013. "Field-normalized impact factors (IFs): A comparison of rescaling and fractionally counted IFs," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(11), pages 2299-2309, November.
    2. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2013. "Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)," Journal of Informetrics, Elsevier, vol. 7(4), pages 933-944.
    3. Lutz Bornmann & Rüdiger Mutz, 2014. "From P100 to P100': A new citation-rank approach," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(9), pages 1939-1943, September.
    4. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    5. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    6. Michel Zitt & Suzy Ramanana-Rahary & Elise Bassecoulard, 2005. "Relativity of citation performance and excellence measures: From cross-field to cross-scale effects of field-normalisation," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 373-401, April.
    7. Loet Leydesdorff & Lutz Bornmann, 2011. "How fractional counting of citations affects the impact factor: Normalization in terms of differences in citation potentials among fields of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(2), pages 217-229, February.
    8. Schreiber, Michael, 2014. "Is the new citation-rank approach P100′ in bibliometrics really new?," Journal of Informetrics, Elsevier, vol. 8(4), pages 997-1004.
    9. Schreiber, Michael, 2014. "Examples for counterintuitive behavior of the new citation-rank indicator P100 for bibliometric evaluations," Journal of Informetrics, Elsevier, vol. 8(3), pages 738-748.
    10. Lutz Bornmann & Werner Marx, 2014. "The wisdom of citing scientists," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(6), pages 1288-1292, June.
    11. Ludo Waltman & Rodrigo Costas, 2014. "F1000 Recommendations as a Potential New Data Source for Research Evaluation: A Comparison With Citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 433-445, March.
    12. Bornmann, Lutz & Williams, Richard, 2013. "How to calculate the practical significance of citation impact differences? An empirical example from evaluative institutional bibliometrics using adjusted predictions and marginal effects," Journal of Informetrics, Elsevier, vol. 7(2), pages 562-574.
    13. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    14. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    15. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    16. Werner Marx & Lutz Bornmann, 2015. "On the causes of subject-specific citation rates in Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1823-1827, February.
    17. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    18. Loet Leydesdorff & Jung C. Shin, 2011. "How to evaluate universities in terms of their relative citation impacts: Fractional counting of citations and the normalization of differences among disciplines," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(6), pages 1146-1155, June.
    19. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    20. Michel Zitt & Henry Small, 2008. "Modifying the journal impact factor by fractional citation weighting: The audience factor," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1856-1860, September.
    21. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    22. Ludo Waltman & Nees Jan van Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    23. Richard Williams, 2012. "Using the margins command to estimate and interpret adjusted predictions and marginal effects," Stata Journal, StataCorp LP, vol. 12(2), pages 308-331, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    2. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    3. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    4. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader counts," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    5. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    6. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    7. Tolga Yuret, 2018. "Author-weighted impact factor and reference return ratio: can we attain more equality among fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2097-2111, September.
    8. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    9. Ahlgren, Per & Waltman, Ludo, 2014. "The correlation between citation-based and expert-based assessments of publication channels: SNIP and SJR vs. Norwegian quality assessments," Journal of Informetrics, Elsevier, vol. 8(4), pages 985-996.
    10. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    11. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    12. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    13. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    14. Chen, Shiji & Arsenault, Clément & Larivière, Vincent, 2015. "Are top-cited papers more interdisciplinary?," Journal of Informetrics, Elsevier, vol. 9(4), pages 1034-1046.
    15. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    16. Cristian Colliander & Per Ahlgren, 2019. "Comparison of publication-level approaches to ex-post citation normalization," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 283-300, July.
    17. Cristiano Varin & Manuela Cattelan & David Firth, 2016. "Statistical modelling of citation exchange between statistics journals," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 179(1), pages 1-63, January.
    18. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    19. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    20. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:9:y:2015:i:2:p:408-418. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.