
A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers

Author

Listed:
  • Caroline S. Wagner (The Ohio State University)
  • Lin Zhang (Wuhan University)
  • Loet Leydesdorff (University of Amsterdam)

Abstract

The top-1% most-highly-cited articles are watched closely as the vanguards of the sciences. Using Web of Science data, one finds that China overtook the USA in relative participation in the top-1% (PP-top1%) in 2019, after outcompeting the EU on this indicator in 2015. However, this finding contrasts with repeated reports from Western agencies that the quality of China’s scientific output lags behind that of other advanced nations, even as it has caught up in numbers of articles. The difference between the results presented here and previous results depends mainly upon field normalizations, which classify source journals by discipline. The average citation rates of these subsets are commonly used as baselines for comparison among disciplines. However, the expected value of the top-1% of a sample of N papers is N / 100, ceteris paribus. Using average citation rates as expected values introduces errors through (1) the use of the mean of highly skewed distributions and (2) a specious precision in the delineation of the subsets. Classifications can be used for decomposition, but not for normalization. When the data are thus decomposed, the USA ranks ahead of China in biomedical fields such as virology. Although the number of papers is smaller, China outperforms the USA in the field of Business and Finance (in the Social Sciences Citation Index; p
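
A minimal sketch (not the authors' code, and using invented lognormal citation data purely for illustration) of the arithmetic point made above: for a sample of N papers, the ceteris-paribus expectation for the number falling in the global top-1% is N / 100, whereas a baseline built on the mean of a highly skewed citation distribution sits far above the median and is therefore a fragile reference point. All names and numbers below are hypothetical.

import numpy as np

rng = np.random.default_rng(42)

# Simulate a highly skewed (lognormal) "world" citation distribution for one field.
world_citations = rng.lognormal(mean=1.0, sigma=1.2, size=100_000)

# The top-1% threshold is the 99th percentile of the world distribution.
top1_threshold = np.quantile(world_citations, 0.99)

# Hypothetical country subset: N papers drawn from the same distribution.
N = 5_000
country_citations = rng.lognormal(mean=1.0, sigma=1.2, size=N)

# Observed number of the country's papers in the global top-1%.
observed_top1 = int((country_citations > top1_threshold).sum())

# Expected value under the ceteris-paribus argument: N / 100.
expected_top1 = N / 100

# PP-top1%: the share of the country's own output that reaches the top-1%.
pp_top1 = observed_top1 / N * 100

print(f"top-1% citation threshold : {top1_threshold:.1f}")
print(f"observed papers in top-1% : {observed_top1}")
print(f"expected (N / 100)        : {expected_top1:.0f}")
print(f"PP-top1%                  : {pp_top1:.2f}%")

# The field mean is pulled far above the median by the skew, which is why
# mean-based field normalization (error 1 above) can distort comparisons.
print(f"field mean citations      : {world_citations.mean():.1f}")
print(f"field median citations    : {np.median(world_citations):.1f}")

Because the simulated country papers are drawn from the same distribution as the world set, the observed count should hover near N / 100 and PP-top1% near 1%; a country performing above expectation would show a PP-top1% above 1%, which is how the indicator is read in the abstract.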

Suggested Citation

  • Caroline S. Wagner & Lin Zhang & Loet Leydesdorff, 2022. "A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1825-1839, April.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04291-z
    DOI: 10.1007/s11192-022-04291-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04291-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04291-z?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Caroline S. Wagner & Loet Leydesdorff, 2012. "An Integrated Impact Indicator: A new definition of 'Impact' with policy relevance," Research Evaluation, Oxford University Press, vol. 21(3), pages 183-188, July.
    2. Leydesdorff, Loet & Wagner, Caroline S. & Bornmann, Lutz, 2014. "The European Union, China, and the United States in the top-1% and top-10% layers of most-frequently cited publications: Competition and collaborations," Journal of Informetrics, Elsevier, vol. 8(3), pages 606-617.
    3. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    4. Alexander I. Pudovkin & Eugene Garfield, 2002. "Algorithmic procedure for finding semantically related journals," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 53(13), pages 1113-1119, November.
    5. Loet Leydesdorff, 2006. "Can scientific journals be classified in terms of aggregated journal‐journal citation relations using the Journal Citation Reports?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(5), pages 601-613, March.
    6. Sivertsen, Gunnar & Rousseau, Ronald & Zhang, Lin, 2019. "Measuring scientific contributions with modified fractional counting," Journal of Informetrics, Elsevier, vol. 13(2), pages 679-694.
    7. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    8. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    9. Ismael Rafols & Loet Leydesdorff, 2009. "Content‐based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(9), pages 1823-1835, September.
    10. Loet Leydesdorff & Lutz Bornmann, 2012. "Testing differences statistically with the Leiden ranking," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(3), pages 781-783, September.
    11. Kevin W. Boyack & Richard Klavans & Katy Börner, 2005. "Mapping the backbone of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(3), pages 351-374, August.
    12. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    13. Wolfgang Glänzel & András Schubert, 2003. "A new classification scheme of science fields and subfields designed for scientometric evaluation purposes," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(3), pages 357-367, March.
    14. David A. King, 2004. "The scientific impact of nations," Nature, Nature, vol. 430(6997), pages 311-316, July.
    15. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    16. Zhou, Ping & Leydesdorff, Loet, 2006. "The emergence of China as a leading nation in science," Research Policy, Elsevier, vol. 35(1), pages 83-104, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2023. "A heuristic approach based on Leiden rankings to identify outliers: evidence from Italian universities in the European landscape," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 483-510, January.
    2. Péter Vinkler, 2023. "Impact of the number and rank of coauthors on h-index and π-index. The part-impact method," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2349-2369, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    2. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    3. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    4. Loet Leydesdorff & Lutz Bornmann & Caroline S. Wagner, 2017. "Generating clustered journal maps: an automated system for hierarchical classification," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1601-1614, March.
    5. Roberto Camerani & Daniele Rotolo & Nicola Grassano, 2018. "Do Firms Publish? A Multi-Sectoral Analysis," SPRU Working Paper Series 2018-21, SPRU - Science Policy Research Unit, University of Sussex Business School.
    6. Wolfram, Dietmar & Zhao, Yuehua, 2014. "A comparison of journal similarity across six disciplines using citing discipline analysis," Journal of Informetrics, Elsevier, vol. 8(4), pages 840-853.
    7. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    8. Leydesdorff, Loet & Bornmann, Lutz & Zhou, Ping, 2016. "Construction of a pragmatic base line for journal classifications and maps based on aggregated journal-journal citation relations," Journal of Informetrics, Elsevier, vol. 10(4), pages 902-918.
    9. Loet Leydesdorff, 2012. "Alternatives to the journal impact factor: I3 and the top-10% (or top-25%?) of the most-highly cited papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 355-365, August.
    10. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    11. Shu, Fei & Julien, Charles-Antoine & Zhang, Lin & Qiu, Junping & Zhang, Jing & Larivière, Vincent, 2019. "Comparing journal and paper level classifications of science," Journal of Informetrics, Elsevier, vol. 13(1), pages 202-225.
    12. Loet Leydesdorff & Caroline S. Wagner & Lutz Bornmann, 2018. "Betweenness and diversity in journal citation networks as measures of interdisciplinarity—A tribute to Eugene Garfield," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 567-592, February.
    13. Leydesdorff, Loet & Rafols, Ismael, 2012. "Interactive overlays: A new method for generating global journal maps from Web-of-Science data," Journal of Informetrics, Elsevier, vol. 6(2), pages 318-332.
    14. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2014. "Evolutionary games between subject categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 869-888, October.
    15. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    16. Ricardo Arencibia-Jorge & Rosa Lidia Vega-Almeida & José Luis Jiménez-Andrade & Humberto Carrillo-Calvet, 2022. "Evolutionary stages and multidisciplinary nature of artificial intelligence research," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5139-5158, September.
    17. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    18. Ismael Rafols & Alan Porter & Loet Leydesdorff, 2009. "Overlay Maps of Science: a New Tool for Research Policy," SPRU Working Paper Series 179, SPRU - Science Policy Research Unit, University of Sussex Business School.
    19. Zhou, Ping & Leydesdorff, Loet, 2011. "Fractional counting of citations in research evaluation: A cross- and interdisciplinary assessment of the Tsinghua University in Beijing," Journal of Informetrics, Elsevier, vol. 5(3), pages 360-368.
    20. Loet Leydesdorff, 2013. "An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 35-55, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04291-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.