The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments

Author

Listed:
  • Colliander, Cristian
  • Ahlgren, Per

Abstract

In this paper we study the effects of the field normalization baseline on the relative performance, in terms of citation impact, of 20 natural science departments. Impact is studied under three baselines: journal, ISI/Thomson Reuters subject category, and Essential Science Indicators field. Citation impact is measured with two indicators: the item-oriented mean normalized citation rate and Top-5%. The results, which we analyze with respect to stability, show that the choice of normalization baseline matters. Normalization against the publishing journal stands out: irrespective of indicator, the rankings of the departments obtained with journal as baseline differ considerably from those obtained with ISI/Thomson Reuters subject category or Essential Science Indicators field as baseline. Since no substantial differences are observed when the Essential Science Indicators field and ISI/Thomson Reuters subject category baselines are contrasted, people without access to subject category data may be able to perform reasonable normalized citation impact studies by combining normalization against journal with normalization against Essential Science Indicators field.
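
The two indicators named in the abstract can be illustrated with a short sketch. The following Python snippet is not the authors' code: the Paper fields, the per-paper baseline values, and the simplified handling of the top-5% threshold are assumptions made purely to illustrate how an item-oriented mean normalized citation rate and a Top-5% share could be computed against a chosen baseline (journal, subject category, or ESI field).

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Paper:
        citations: int               # citations received by the paper
        baseline_mean: float         # mean citation rate of the chosen baseline set
        baseline_top5_cutoff: float  # citation count at the 95th percentile of the baseline set

    def item_oriented_mncr(papers: List[Paper]) -> float:
        # Item-oriented mean normalized citation rate: the average of the
        # per-paper ratios citations / expected citations under the baseline.
        ratios = [p.citations / p.baseline_mean for p in papers if p.baseline_mean > 0]
        return sum(ratios) / len(ratios) if ratios else 0.0

    def top5_share(papers: List[Paper]) -> float:
        # Top-5% indicator: share of the unit's papers whose citation count
        # reaches the top-5% threshold of their baseline set (ties simplified).
        flags = [p.citations >= p.baseline_top5_cutoff for p in papers]
        return sum(flags) / len(flags) if flags else 0.0

    # Hypothetical department with three papers, evaluated against one baseline choice.
    dept = [Paper(12, 4.0, 20.0), Paper(3, 4.0, 20.0), Paper(25, 6.5, 30.0)]
    print(item_oriented_mncr(dept))  # about 2.53
    print(top5_share(dept))          # 0.0 with these made-up numbers

Swapping in a different baseline's mean and top-5% threshold (journal, subject category, or ESI field) is what changes the departments' relative scores and rankings in the study.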

Suggested Citation

  • Colliander, Cristian & Ahlgren, Per, 2011. "The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments," Journal of Informetrics, Elsevier, vol. 5(1), pages 101-113.
  • Handle: RePEc:eee:infome:v:5:y:2011:i:1:p:101-113
    DOI: 10.1016/j.joi.2010.09.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157710000866
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2010.09.003?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Strotmann, Andreas & Zhao, Dangzhi, 2010. "Combining commercial citation indexes and open-access bibliographic databases to delimit highly interdisciplinary research fields for citation analysis," Journal of Informetrics, Elsevier, vol. 4(2), pages 194-200.
    2. Rafael Ball & Bernhard Mittermaier & Dirk Tunger, 2009. "Creation of journal-based publication profiles of scientific institutions — A methodology for the interdisciplinary comparison of scientific research based on the J-factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(2), pages 381-392, November.
    3. van Raan, Anthony F.J. & van Leeuwen, Thed N. & Visser, Martijn S. & van Eck, Nees Jan & Waltman, Ludo, 2010. "Rivals for the crown: Reply to Opthof and Leydesdorff," Journal of Informetrics, Elsevier, vol. 4(3), pages 431-435.
    4. Wolfgang Glänzel & Bart Thijs & András Schubert & Koenraad Debackere, 2009. "Subfield-specific normalized relative indicators and a new generation of relational charts: Methodological foundations illustrated on the assessment of institutional research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(1), pages 165-188, January.
    5. Michel Zitt & Suzy Ramanana-Rahary & Elise Bassecoulard, 2005. "Relativity of citation performance and excellence measures: From cross-field to cross-scale effects of field-normalisation," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 373-401, April.
    6. W. Glänzel & A. Schubert & U. Schoepflin & H. J. Czerwon, 1999. "An item-by-item subject classification of papers published in journals covered by the SSCI database using reference analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 46(3), pages 431-441, November.
    7. W. Glänzel & A. Schubert & H. -J. Czerwon, 1999. "An item-by-item subject classification of papers published in multidisciplinary and general journals using reference analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 44(3), pages 427-439, March.
    8. Christoph Neuhaus & Hans-Dieter Daniel, 2009. "A new reference standard for citation analysis in chemistry and related fields based on the sections of Chemical Abstracts," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(2), pages 219-229, February.
    9. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    10. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    11. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    12. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    13. Ronald N. Kostoff, 2002. "Citation analysis of research performer quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(1), pages 49-71, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Marcel Clermont & Alexander Dirksen & Barbara Scheidt & Dirk Tunger, 2017. "Citation metrics as an additional indicator for evaluating research performance? An analysis of their correlations and validity," Business Research, Springer;German Academic Association for Business Research, vol. 10(2), pages 249-279, October.
    2. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan Eck & Thed N. Leeuwen & Anthony F.J. Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    3. Schneider, Jesper W., 2013. "Caveats for using statistical significance tests in research assessments," Journal of Informetrics, Elsevier, vol. 7(1), pages 50-62.
    4. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2023. "A heuristic approach based on Leiden rankings to identify outliers: evidence from Italian universities in the European landscape," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 483-510, January.
    5. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    6. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    7. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    8. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2022. "Using the Leiden Rankings as a Heuristics: Evidence from Italian universities in the European landscape," LEM Papers Series 2022/08, Laboratory of Economics and Management (LEM), Sant'Anna School of Advanced Studies, Pisa, Italy.
    9. Ludo Waltman & Nees Jan Eck & Thed N. Leeuwen & Martijn S. Visser & Anthony F. J. Raan, 2011. "Towards a new crown indicator: an empirical analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 467-481, June.
    10. Abramo, Giovanni & D’Angelo, Ciriaco Andrea & Grilli, Leonardo, 2015. "Funnel plots for visualizing uncertainty in the research performance of institutions," Journal of Informetrics, Elsevier, vol. 9(4), pages 954-961.
    11. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    12. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    13. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    14. Vinkler, Péter, 2013. "Comparative rank assessment of journal articles," Journal of Informetrics, Elsevier, vol. 7(3), pages 712-717.
    15. Pislyakov, Vladimir, 2022. "On some properties of medians, percentiles, baselines, and thresholds in empirical bibliometric analysis," Journal of Informetrics, Elsevier, vol. 16(4).
    16. Vaccario, Giacomo & Medo, Matúš & Wider, Nicolas & Mariani, Manuel Sebastian, 2017. "Quantifying and suppressing ranking bias in a large citation network," Journal of Informetrics, Elsevier, vol. 11(3), pages 766-782.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Ludo Waltman & Nees Jan Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    3. Zhou, Ping & Leydesdorff, Loet, 2011. "Fractional counting of citations in research evaluation: A cross- and interdisciplinary assessment of the Tsinghua University in Beijing," Journal of Informetrics, Elsevier, vol. 5(3), pages 360-368.
    4. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    5. Loet Leydesdorff, 2012. "Alternatives to the journal impact factor: I3 and the top-10% (or top-25%?) of the most-highly cited papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 355-365, August.
    6. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    7. Pablo Dorta-González & María Isabel Dorta-González & Rafael Suárez-Vega, 2015. "An approach to the author citation potential: measures of scientific performance which are invariant across scientific fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1467-1496, February.
    8. Ludo Waltman & Nees Jan Eck & Thed N. Leeuwen & Martijn S. Visser & Anthony F. J. Raan, 2011. "Towards a new crown indicator: an empirical analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 467-481, June.
    9. Rons, Nadine, 2012. "Partition-based Field Normalization: An approach to highly specialized publication records," Journal of Informetrics, Elsevier, vol. 6(1), pages 1-10.
    10. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2013. "Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)," Journal of Informetrics, Elsevier, vol. 7(4), pages 933-944.
    11. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    12. Loet Leydesdorff, 2013. "An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 35-55, January.
    13. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    14. Franceschini, Fiorenzo & Maisano, Domenico, 2014. "Sub-field normalization of the IEEE scientific journals based on their connection with Technical Societies," Journal of Informetrics, Elsevier, vol. 8(3), pages 508-533.
    15. Vinkler, Péter, 2012. "The case of scientometricians with the “absolute relative” impact indicator," Journal of Informetrics, Elsevier, vol. 6(2), pages 254-264.
    16. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    17. Larivière, Vincent & Gingras, Yves, 2011. "Averages of ratios vs. ratios of averages: An empirical analysis of four levels of aggregation," Journal of Informetrics, Elsevier, vol. 5(3), pages 392-399.
    18. Ludo Waltman & Erjia Yan & Nees Jan Eck, 2011. "A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 301-314, October.
    19. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    20. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:5:y:2011:i:1:p:101-113. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.