Printed from https://ideas.repec.org/a/eee/infome/v18y2024i3s1751157724000440.html

The misuse of the nonlinear field normalization method: Nonlinear field normalization citation counts at the paper level should not be added or averaged

Author

Listed:
  • Wang, Xing

Abstract

Paper-level normalized citation counts obtained with nonlinear field normalization methods should not be added or averaged. Unfortunately, many cases of adding or averaging such nonlinear normalized citation counts can be found in the academic literature, indicating that nonlinear field normalization methods have long been misused. This paper makes three contributions. First, we analyze, from a mathematical perspective, why the nonlinear normalized citation counts of individual papers should not be added or averaged, and we provide proofs for the crucial steps of the analysis. Second, we systematically classify the main existing field normalization methods into linear and nonlinear methods. Third, we use real citation data to explore the errors that adding or averaging nonlinear normalized citation counts introduces into practical research evaluation results. Together, these contributions provide a theoretical basis for the proper use of field normalization methods. Furthermore, because our mathematical proof applies to all nonlinear transformations over the entire real number domain, the results are also meaningful for the broader field of data and information science.
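The core issue the abstract describes can be sketched numerically. A minimal illustration, under the assumption (not taken from the paper itself) that a nonlinear normalization such as log(1 + c) is applied to raw citation counts c: because the transform is nonlinear, averaging the transformed paper-level scores is not equivalent to transforming the average, so aggregated scores are distorted.

```python
import math

# Hypothetical illustration (the specific transform and data are
# assumptions, not taken from the paper): apply a nonlinear
# normalization log(1 + c) to each paper's citation count c.
citations = [0, 2, 5, 100]  # raw citation counts of four papers

# Averaging AFTER the nonlinear transform...
mean_of_normalized = sum(math.log1p(c) for c in citations) / len(citations)

# ...differs from applying the transform to the average:
normalized_mean = math.log1p(sum(citations) / len(citations))

print(mean_of_normalized)  # average of the log-normalized scores
print(normalized_mean)     # log-normalization of the average count

# For a concave transform like log1p, Jensen's inequality guarantees
# mean_of_normalized <= normalized_mean, with equality only when all
# counts are identical. This is one way the order of averaging and
# (nonlinear) normalization changes evaluation results.
```

The gap widens as the citation distribution becomes more skewed, which is exactly the situation citation data presents; a linear normalization (e.g., dividing by a field mean) commutes with averaging and avoids this problem.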

Suggested Citation

  • Wang, Xing, 2024. "The misuse of the nonlinear field normalization method: Nonlinear field normalization citation counts at the paper level should not be added or averaged," Journal of Informetrics, Elsevier, vol. 18(3).
  • Handle: RePEc:eee:infome:v:18:y:2024:i:3:s1751157724000440
    DOI: 10.1016/j.joi.2024.101531

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157724000440
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2024.101531?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Filippo Radicchi & Claudio Castellano, 2012. "A Reverse Engineering Approach to the Suppression of Citation Biases Reveals Universal Properties of Citation Distributions," PLOS ONE, Public Library of Science, vol. 7(3), pages 1-9, March.
    2. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    3. Han Zheng & Dion Hoe‐Lian Goh & Edmund Wei Jian Lee & Chei Sian Lee & Yin‐Leng Theng, 2022. "Understanding the effects of message cues on COVID‐19 information sharing on Twitter," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(6), pages 847-862, June.
    4. Juan A Crespo & Yungrong Li & Javier Ruiz–Castillo, 2013. "The Measurement of the Effect on Citation Inequality of Differences in Citation Practices across Scientific Fields," PLOS ONE, Public Library of Science, vol. 8(3), pages 1-9, March.
    5. Stephen Jackson & Nadia Vanteeva & Colm Fearon, 2019. "An Investigation of the Impact of Data Breach Severity on the Readability of Mandatory Data Breach Notification Letters: Evidence From U.S. Firms," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(11), pages 1277-1289, November.
    6. Mike Thelwall & Nabeil Maflahi, 2020. "Academic collaboration rates and citation associations vary substantially between countries and fields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(8), pages 968-978, August.
    7. Loet Leydesdorff & Tobias Opthof, 2010. "Scopus's source normalized impact per paper (SNIP) versus a journal impact factor based on fractional counting of citations," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(11), pages 2365-2369, November.
    8. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    9. Marcello D’Agostino & Valentino Dardanoni & Roberto Ghiselli Ricci, 2017. "How to standardize (if you must)," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 825-843, November.
    10. Ludo Waltman & Nees Jan van Eck & Thed N. van Leeuwen & Martijn S. Visser & Anthony F. J. van Raan, 2011. "Towards a new crown indicator: an empirical analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 467-481, June.
    11. Zhihui Zhang & Ying Cheng & Nian Cai Liu, 2014. "Comparison of the effect of mean-based method and z-score for field normalization of citations at the level of Web of Science subject categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(3), pages 1679-1693, December.
    12. Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
    13. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    14. Franceschini, Fiorenzo & Maisano, Domenico, 2017. "Critical remarks on the Italian research assessment exercise VQR 2011–2014," Journal of Informetrics, Elsevier, vol. 11(2), pages 337-357.
    15. Michel Zitt & Henry Small, 2008. "Modifying the journal impact factor by fractional citation weighting: The audience factor," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1856-1860, September.
    16. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    17. Zhou, Ping & Zhong, Yongfeng, 2012. "The citation-based indicator and combined impact indicator—New options for measuring impact," Journal of Informetrics, Elsevier, vol. 6(4), pages 631-638.
    18. Zhihui Zhang & Ying Cheng & Nian Cai Liu, 2015. "Improving the normalization effect of mean-based method from the perspective of optimization: optimization-based linear methods and their performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 587-607, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    3. Wang, Xing & Zhang, Zhihui, 2020. "Improving the reliability of short-term citation impact indicators by taking into account the correlation between short- and long-term citation impact," Journal of Informetrics, Elsevier, vol. 14(2).
    4. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    5. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    6. Thelwall, Mike & Fairclough, Ruth, 2017. "The accuracy of confidence intervals for field normalised indicators," Journal of Informetrics, Elsevier, vol. 11(2), pages 530-540.
    7. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    8. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader ," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    9. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    10. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    11. Li, Yunrong & Radicchi, Filippo & Castellano, Claudio & Ruiz-Castillo, Javier, 2013. "Quantitative evaluation of alternative field normalization procedures," Journal of Informetrics, Elsevier, vol. 7(3), pages 746-755.
    12. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    13. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    14. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    15. Thelwall, Mike, 2018. "Do females create higher impact research? Scopus citations and Mendeley readers for articles from five countries," Journal of Informetrics, Elsevier, vol. 12(4), pages 1031-1041.
    16. Zhihui Zhang & Ying Cheng & Nian Cai Liu, 2015. "Improving the normalization effect of mean-based method from the perspective of optimization: optimization-based linear methods and their performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 587-607, January.
    17. Rons, Nadine, 2012. "Partition-based Field Normalization: An approach to highly specialized publication records," Journal of Informetrics, Elsevier, vol. 6(1), pages 1-10.
    18. Cristian Colliander & Per Ahlgren, 2019. "Comparison of publication-level approaches to ex-post citation normalization," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 283-300, July.
    19. Tolga Yuret, 2018. "Author-weighted impact factor and reference return ratio: can we attain more equality among fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2097-2111, September.
    20. Haunschild, Robin & Bornmann, Lutz, 2016. "Normalization of Mendeley reader counts for impact assessment," Journal of Informetrics, Elsevier, vol. 10(1), pages 62-73.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:18:y:2024:i:3:s1751157724000440. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.