
Field-normalized impact factors (IFs): A comparison of rescaling and fractionally counted IFs

Author

Listed:
  • Loet Leydesdorff
  • Filippo Radicchi
  • Lutz Bornmann
  • Claudio Castellano
  • Wouter de Nooy

Abstract

Two methods for comparing impact factors and citation rates across fields of science are tested against each other using citations to the 3,705 journals in the Science Citation Index 2010 (CD-ROM version of SCI) and the 13 field categories used for the Science and Engineering Indicators of the U.S. National Science Board. We compare (a) normalization by counting citations in proportion to the length of the reference list (1/N of references) with (b) rescaling by dividing citation scores by the arithmetic mean of the citation rate of the cluster. Rescaling is analytical and therefore independent of the quality of the attribution to the sets, whereas fractional counting provides an empirical strategy for normalization among sets (by evaluating the between-group variance). By the fairness test of Radicchi and Castellano, rescaling outperforms fractional counting of citations for reasons that we consider.
(This abstract was borrowed from another version of this item.)
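
The two normalization methods compared in the abstract can be illustrated with a short, self-contained sketch. This is not the authors' code; the toy data, field assignments, and variable names below are hypothetical and only show mechanically how (a) fractional counting, where each citation is weighted by 1/N of the citing paper's reference list, differs from (b) rescaling, where raw citation counts are divided by the arithmetic mean citation rate of the field cluster.

    # Illustrative sketch (hypothetical data and names), in Python.
    from collections import defaultdict
    from statistics import mean

    # Each citing paper: its field, reference-list length, and the cited journals.
    citing_papers = [
        {"field": "physics", "n_refs": 40, "cites": ["J1", "J1", "J2"]},
        {"field": "biology", "n_refs": 20, "cites": ["J2", "J3"]},
        {"field": "biology", "n_refs": 50, "cites": ["J1", "J3", "J3"]},
    ]

    # (a) Fractional counting: each citation contributes 1/N of the citing
    # paper's reference-list length, so long reference lists weigh less per citation.
    fractional = defaultdict(float)
    for p in citing_papers:
        for journal in p["cites"]:
            fractional[journal] += 1.0 / p["n_refs"]

    # (b) Rescaling: divide a journal's raw citation count by the arithmetic
    # mean citation rate of its field (cluster). The field assignment of the
    # cited journals is assumed here purely for illustration.
    field_of = {"J1": "physics", "J2": "biology", "J3": "biology"}
    raw = defaultdict(int)
    for p in citing_papers:
        for journal in p["cites"]:
            raw[journal] += 1

    field_means = {
        f: mean(raw[j] for j in field_of if field_of[j] == f)
        for f in set(field_of.values())
    }
    rescaled = {j: raw[j] / field_means[field_of[j]] for j in raw}

    print("fractional:", dict(fractional))
    print("rescaled:", rescaled)

The sketch makes the contrast visible: fractional counting needs only the citing papers' reference lists, whereas rescaling depends on how journals are attributed to field clusters.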

Suggested Citation

  • Loet Leydesdorff & Filippo Radicchi & Lutz Bornmann & Claudio Castellano & Wouter de Nooy, 2013. "Field-normalized impact factors (IFs): A comparison of rescaling and fractionally counted IFs," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(11), pages 2299-2309, November.
  • Handle: RePEc:bla:jinfst:v:64:y:2013:i:11:p:2299-2309

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/asi.22911
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to look for a different version below or search for one.

    Other versions of this item:

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    2. George Vrettas & Mark Sanderson, 2015. "Conferences versus journals in computer science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2674-2684, December.
    3. Cristiano Varin & Manuela Cattelan & David Firth, 2016. "Statistical modelling of citation exchange between statistics journals," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 179(1), pages 1-63, January.
    4. Cristian Colliander & Per Ahlgren, 2019. "Comparison of publication-level approaches to ex-post citation normalization," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 283-300, July.
    5. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader counts," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    6. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    7. Pedro Albarrán & Antonio Perianes-Rodríguez & Javier Ruiz-Castillo, 2015. "Differences in citation impact across countries," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(3), pages 512-525, March.
    8. Hiran H. Lathabai & Thara Prabhakaran, 2023. "Contextual Ψ-index and its estimate for contextual productivity assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4875-4886, August.
    9. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    10. Li, Yunrong & Radicchi, Filippo & Castellano, Claudio & Ruiz-Castillo, Javier, 2013. "Quantitative evaluation of alternative field normalization procedures," Journal of Informetrics, Elsevier, vol. 7(3), pages 746-755.
    11. Elizabeth S. Vieira, 2023. "The influence of research collaboration on citation impact: the countries in the European Innovation Scoreboard," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3555-3579, June.
    12. Li, Menghui & Yang, Liying & Zhang, Huina & Shen, Zhesi & Wu, Chensheng & Wu, Jinshan, 2017. "Do mathematicians, economists and biomedical scientists trace large topics more strongly than physicists?," Journal of Informetrics, Elsevier, vol. 11(2), pages 598-607.
    13. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    14. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    15. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    16. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    17. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    18. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    19. Ennas, Gianfranco & Di Guardo, Maria Chiara, 2015. "Features of top-rated gold open access journals: An analysis of the Scopus database," Journal of Informetrics, Elsevier, vol. 9(1), pages 79-89.
    20. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jinfst:v:64:y:2013:i:11:p:2299-2309. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.asis.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.