
A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers

Author

Listed:
  • Caroline S. Wagner

    (The Ohio State University)

  • Lin Zhang

    (Wuhan University)

  • Loet Leydesdorff

    (University of Amsterdam)

Abstract

The top-1% most-highly-cited articles are watched closely as the vanguard of the sciences. Using Web of Science data, China overtook the USA in relative participation in the top-1% (PP-top1%) in 2019, after outcompeting the EU on this indicator in 2015. This finding, however, contrasts with repeated reports from Western agencies that the quality of China’s scientific output lags behind that of other advanced nations, even as it has caught up in numbers of articles. The difference between the results presented here and the previous results depends mainly upon field normalizations, which classify source journals by discipline. The average citation rates of these subsets are commonly used as baselines so that one can compare among disciplines. However, the expected value of the top-1% of a sample of N papers is N / 100, ceteris paribus. Using average citation rates as expected values introduces errors by (1) taking the mean of highly skewed distributions and (2) assuming a specious precision in the delineation of the subsets. Classifications can be used for the decomposition, but not for the normalization. When the data are thus decomposed, the USA ranks ahead of China in biomedical fields such as virology. Although the number of papers is smaller, China outperforms the US in the field of Business and Finance (in the Social Sciences Citation Index).
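The abstract's two statistical points — that the expected top-1% count in a sample of N papers is simply N / 100, and that the mean of a highly skewed citation distribution is a misleading baseline — can be illustrated with a small simulation. This is a sketch using an assumed lognormal citation distribution, not the paper's actual Web of Science data:

```python
import random
import statistics

random.seed(42)

# Simulate a highly skewed citation distribution. A lognormal shape is a
# common stand-in for citation counts; the parameters here are illustrative
# assumptions, not estimates from the paper's data.
N = 100_000
citations = [int(random.lognormvariate(1.0, 1.5)) for _ in range(N)]

# By the percentile definition, the expected number of papers from a random
# sample of N that fall in the top-1% is N / 100, ceteris paribus.
threshold = sorted(citations)[int(N * 0.99)]  # 99th-percentile citation count
top1 = [c for c in citations if c > threshold]

mean_c = statistics.mean(citations)
median_c = statistics.median(citations)

print(f"expected top-1% count: {N // 100}")
print(f"observed top-1% count: {len(top1)}")
print(f"mean citations:   {mean_c:.1f}")
print(f"median citations: {median_c:.1f}")  # mean far exceeds the median
```

Because the distribution is skewed, the mean sits well above the median, so a field baseline built on average citation rates is dominated by a few extreme papers — which is the error the authors attribute to mean-based field normalization.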

Suggested Citation

  • Caroline S. Wagner & Lin Zhang & Loet Leydesdorff, 2022. "A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1825-1839, April.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04291-z
    DOI: 10.1007/s11192-022-04291-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04291-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04291-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Caroline S. Wagner & Loet Leydesdorff, 2012. "An Integrated Impact Indicator: A new definition of 'Impact' with policy relevance," Research Evaluation, Oxford University Press, vol. 21(3), pages 183-188, July.
    2. Loet Leydesdorff, 2006. "Can scientific journals be classified in terms of aggregated journal‐journal citation relations using the Journal Citation Reports?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(5), pages 601-613, March.
    3. Leydesdorff, Loet & Wagner, Caroline S. & Bornmann, Lutz, 2014. "The European Union, China, and the United States in the top-1% and top-10% layers of most-frequently cited publications: Competition and collaborations," Journal of Informetrics, Elsevier, vol. 8(3), pages 606-617.
    4. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    5. Kevin W. Boyack & Richard Klavans & Katy Börner, 2005. "Mapping the backbone of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(3), pages 351-374, August.
    6. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    7. Sivertsen, Gunnar & Rousseau, Ronald & Zhang, Lin, 2019. "Measuring scientific contributions with modified fractional counting," Journal of Informetrics, Elsevier, vol. 13(2), pages 679-694.
    8. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    9. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    10. Ismael Rafols & Loet Leydesdorff, 2009. "Content‐based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(9), pages 1823-1835, September.
    11. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    12. Zhou, Ping & Leydesdorff, Loet, 2006. "The emergence of China as a leading nation in science," Research Policy, Elsevier, vol. 35(1), pages 83-104, February.
    13. Loet Leydesdorff & Lutz Bornmann, 2012. "Testing differences statistically with the Leiden ranking," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(3), pages 781-783, September.
    14. Alexander I. Pudovkin & Eugene Garfield, 2002. "Algorithmic procedure for finding semantically related journals," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 53(13), pages 1113-1119, November.
    15. Wolfgang Glänzel & András Schubert, 2003. "A new classification scheme of science fields and subfields designed for scientometric evaluation purposes," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(3), pages 357-367, March.
    16. David A. King, 2004. "The scientific impact of nations," Nature, Nature, vol. 430(6997), pages 311-316, July.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2023. "A heuristic approach based on Leiden rankings to identify outliers: evidence from Italian universities in the European landscape," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 483-510, January.
    2. Péter Vinkler, 2023. "Impact of the number and rank of coauthors on h-index and π-index. The part-impact method," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2349-2369, April.
    3. Ziyou Teng & Xuezhong Zhu, 2024. "Measuring the global and domestic technological impact of Chinese scientific output: a patent-to-paper citation analysis of science-technology linkage," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5181-5210, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    2. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    3. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    4. Loet Leydesdorff & Lutz Bornmann & Caroline S. Wagner, 2017. "Generating clustered journal maps: an automated system for hierarchical classification," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1601-1614, March.
    5. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    6. Roberto Camerani & Daniele Rotolo & Nicola Grassano, 2018. "Do Firms Publish? A Multi-Sectoral Analysis," SPRU Working Paper Series 2018-21, SPRU - Science Policy Research Unit, University of Sussex Business School.
    7. Wolfram, Dietmar & Zhao, Yuehua, 2014. "A comparison of journal similarity across six disciplines using citing discipline analysis," Journal of Informetrics, Elsevier, vol. 8(4), pages 840-853.
    8. Leydesdorff, Loet & Bornmann, Lutz & Zhou, Ping, 2016. "Construction of a pragmatic base line for journal classifications and maps based on aggregated journal-journal citation relations," Journal of Informetrics, Elsevier, vol. 10(4), pages 902-918.
    9. Loet Leydesdorff, 2012. "Alternatives to the journal impact factor: I3 and the top-10% (or top-25%?) of the most-highly cited papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 355-365, August.
    10. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    11. Shu, Fei & Julien, Charles-Antoine & Zhang, Lin & Qiu, Junping & Zhang, Jing & Larivière, Vincent, 2019. "Comparing journal and paper level classifications of science," Journal of Informetrics, Elsevier, vol. 13(1), pages 202-225.
    12. Loet Leydesdorff & Caroline S. Wagner & Lutz Bornmann, 2018. "Betweenness and diversity in journal citation networks as measures of interdisciplinarity—A tribute to Eugene Garfield," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 567-592, February.
    13. Leydesdorff, Loet & Rafols, Ismael, 2012. "Interactive overlays: A new method for generating global journal maps from Web-of-Science data," Journal of Informetrics, Elsevier, vol. 6(2), pages 318-332.
    14. Xie, Yundong & Wu, Qiang & Zhang, Peng & Li, Xingchen, 2020. "Information Science and Library Science (IS-LS) journal subject categorisation and comparison based on editorship information," Journal of Informetrics, Elsevier, vol. 14(4).
    15. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2014. "Evolutionary games between subject categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 869-888, October.
    16. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
    17. Jielan Ding & Per Ahlgren & Liying Yang & Ting Yue, 2018. "Disciplinary structures in Nature, Science and PNAS: journal and country levels," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 1817-1852, September.
    18. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    19. Ricardo Arencibia-Jorge & Rosa Lidia Vega-Almeida & José Luis Jiménez-Andrade & Humberto Carrillo-Calvet, 2022. "Evolutionary stages and multidisciplinary nature of artificial intelligence research," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5139-5158, September.
    20. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04291-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.