
Errors of measurement in scientometrics: classification schemes and document types in citation and publication rankings

Author

Listed:
  • Nicolas Robinson-Garcia (University of Granada)
  • Benjamín Vargas-Quesada (University of Granada)
  • Daniel Torres-Salinas (University of Granada)
  • Zaida Chinchilla-Rodríguez (Consejo Superior de Investigaciones Científicas (CSIC))
  • Juan Gorraiz (University of Vienna)

Abstract

This article examines methodological challenges in scientometrics, focusing on errors that stem from the choice of classification scheme and document type. Through two case studies, we assess the impact of these methodological choices on the publication and citation rankings of institutions. We compute seven bibliometric indicators for over 8434 institutions using 23 classification schemes derived from Clarivate’s InCites suite, and compare results obtained when all document types are included with those based on citable items only. Given the critical role university rankings play in research management and the methodological controversies surrounding them, our goal is to propose a methodology that incorporates uncertainty levels when reporting bibliometric performance in professional practice. We then explore differences in error estimates within research fields as well as between institutions from different geographic regions. The findings underscore the importance of responsible metric use in research evaluation, providing valuable insights both for bibliometricians and for consumers of such data.
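
As a rough illustration of the kind of uncertainty reporting described above, the sketch below ranks institutions under several classification schemes and uses the spread of ranks as a crude error band. It is a minimal Python/pandas example: the institutions, scheme labels, column names and indicator values are invented for illustration and do not reproduce the authors' actual indicators, data or pipeline.

    import pandas as pd

    # Illustrative sketch (not the authors' actual method): given one bibliometric
    # indicator computed per institution under several classification schemes,
    # report each institution's rank together with a simple uncertainty band.
    # All data and labels below are hypothetical.
    scores = pd.DataFrame({
        "institution": ["A", "B", "C"] * 3,
        "scheme": (["WoS subject categories"] * 3
                   + ["Essential Science Indicators"] * 3
                   + ["publication-level clusters"] * 3),
        "indicator": [1.10, 0.95, 1.30,
                      1.05, 1.20, 1.15,
                      0.90, 1.25, 1.10],
    })

    # Rank institutions within each scheme (1 = best).
    scores["rank"] = scores.groupby("scheme")["indicator"].rank(ascending=False, method="min")

    # Summarize each institution's rank across schemes; the spread is a crude
    # proxy for the error introduced by the choice of classification scheme.
    summary = (scores.groupby("institution")["rank"]
               .agg(median_rank="median", best_rank="min", worst_rank="max"))
    summary["rank_spread"] = summary["worst_rank"] - summary["best_rank"]
    print(summary.sort_values("median_rank"))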

Suggested Citation

  • Nicolas Robinson-Garcia & Benjamín Vargas-Quesada & Daniel Torres-Salinas & Zaida Chinchilla-Rodríguez & Juan Gorraiz, 2024. "Errors of measurement in scientometrics: classification schemes and document types in citation and publication rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(10), pages 6455-6475, October.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:10:d:10.1007_s11192-024-05159-0
    DOI: 10.1007/s11192-024-05159-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05159-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05159-0?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nicolás Robinson-García & Clara Calero-Medina, 2014. "What do university rankings by fields rank? Exploring discrepancies between the organizational structure of universities and bibliometric classifications," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1955-1970, March.
    2. Ludo Waltman & Clara Calero‐Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    3. Bart Thijs & Lin Zhang & Wolfgang Glänzel, 2015. "Bibliographic coupling and hierarchical clustering for the validation and improvement of subject-classification schemes," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1453-1467, December.
    4. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    5. Stephan Stahlschmidt & Dimity Stephen, 2022. "From indexation policies through citation networks to normalized citation impacts: Web of Science, Scopus, and Dimensions as varying resonance chambers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2413-2431, May.
    6. Daniel Torres-Salinas & Wenceslao Arroyo-Machado & Nicolas Robinson-Garcia, 2023. "Bibliometric denialism," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5357-5359, September.
    7. Henk F. Moed, 2008. "UK Research Assessment Exercises: Informed judgments on research quality or quantity?," Scientometrics, Springer;Akadémiai Kiadó, vol. 74(1), pages 153-161, January.
    8. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2015. "Errors in DOI indexing by bibliometric databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2181-2186, March.
    9. Ludo Waltman & Nees Jan Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    10. Alexander I. Pudovkin & Eugene Garfield, 2002. "Algorithmic procedure for finding semantically related journals," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 53(13), pages 1113-1119, November.
    11. Christian Schloegl & Juan Gorraiz, 2010. "Comparison of citation and usage indicators: the case of oncology journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(3), pages 567-580, March.
    12. Muñoz-Écija, Teresa & Vargas-Quesada, Benjamín & Chinchilla Rodríguez, Zaida, 2019. "Coping with methods for delineating emerging fields: Nanoscience and nanotechnology as a case study," Journal of Informetrics, Elsevier, vol. 13(4).
    13. Gómez-Núñez, Antonio J. & Batagelj, Vladimir & Vargas-Quesada, Benjamín & Moya-Anegón, Félix & Chinchilla-Rodríguez, Zaida, 2014. "Optimizing SCImago Journal & Country Rank classification by community detection," Journal of Informetrics, Elsevier, vol. 8(2), pages 369-383.
    14. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan Eck & Thed N. Leeuwen & Anthony F.J. Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    15. Antonio Perianes‐Rodriguez & Javier Ruiz‐Castillo, 2018. "The impact of classification systems in the evaluation of the research performance of the Leiden Ranking universities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(8), pages 1046-1053, August.
    16. Maxime Rivest & Etienne Vignola-Gagné & Éric Archambault, 2021. "Article-level classification of scientific publications: A comparison of deep learning, direct citation and bibliographic coupling," PLOS ONE, Public Library of Science, vol. 16(5), pages 1-18, May.
    17. Elizabeth Gadd, 2020. "University rankings need a rethink," Nature, Nature, vol. 587(7835), pages 523-523, November.
    18. Gorraiz, Juan & Melero-Fuentes, David & Gumpenberger, Christian & Valderrama-Zurián, Juan-Carlos, 2016. "Availability of digital object identifiers (DOIs) in Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(1), pages 98-109.
    19. Robin Haunschild & Lutz Bornmann, 2022. "Relevance of document types in the scores’ calculation of a specific field-normalized indicator: Are the scores strongly dependent on or nearly independent of the document type handling?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4419-4438, August.
    20. Paul Donner, 2017. "Document type assignment accuracy in the journal citation index data of Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 219-236, October.
    21. Shu, Fei & Julien, Charles-Antoine & Zhang, Lin & Qiu, Junping & Zhang, Jing & Larivière, Vincent, 2019. "Comparing journal and paper level classifications of science," Journal of Informetrics, Elsevier, vol. 13(1), pages 202-225.
    22. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    23. Diana Hicks, 1999. "The difficulty of achieving full coverage of international social science literature and the bibliometric consequences," Scientometrics, Springer;Akadémiai Kiadó, vol. 44(2), pages 193-215, February.
    24. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    25. Björn Hammarfelt & Alexander D. Rushforth, 2017. "Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation," Research Evaluation, Oxford University Press, vol. 26(3), pages 169-180.
    26. Thed N. Van Leeuwen & Henk F. Moed & Robert J. W. Tijssen & Martijn S. Visser & Anthony F. J. Van Raan, 2001. "Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 51(1), pages 335-346, April.
    27. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    28. Michael Gusenbauer, 2022. "Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2683-2745, May.
    29. Petr Heneberg, 2014. "Parallel worlds of citable documents and others: Inflated commissioned opinion articles enhance scientometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 635-643, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    3. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    4. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    5. Gerson Pech & Catarina Delgado & Silvio Paolo Sorella, 2022. "Classifying papers into subfields using Abstracts, Titles, Keywords and KeyWords Plus through pattern detection and optimization procedures: An application in Physics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(11), pages 1513-1528, November.
    6. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    7. Gabriel Alves Vieira & Jacqueline Leta, 2024. "biblioverlap: an R package for document matching across bibliographic datasets," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4513-4527, July.
    8. Yu-Wei Chang, 2019. "Are articles in library and information science (LIS) journals primarily contributed to by LIS authors?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 81-104, October.
    9. Shirley Ainsworth & Jane M. Russell, 2018. "Has hosting on science direct improved the visibility of Latin American scholarly journals? A preliminary analysis of data quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1463-1484, June.
    10. Michael Gusenbauer, 2022. "Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2683-2745, May.
    11. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    12. Muñoz-Écija, Teresa & Vargas-Quesada, Benjamín & Chinchilla Rodríguez, Zaida, 2019. "Coping with methods for delineating emerging fields: Nanoscience and nanotechnology as a case study," Journal of Informetrics, Elsevier, vol. 13(4).
    13. Urdiales, Cristina & Guzmán, Eduardo, 2024. "An automatic and association-based procedure for hierarchical publication subject categorization," Journal of Informetrics, Elsevier, vol. 18(1).
    14. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    15. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.
    16. Bornmann, Lutz & Haunschild, Robin, 2022. "Empirical analysis of recent temporal dynamics of research fields: Annual publications in chemistry and related areas as an example," Journal of Informetrics, Elsevier, vol. 16(2).
    17. Roberto Camerani & Daniele Rotolo & Nicola Grassano, 2018. "Do Firms Publish? A Multi-Sectoral Analysis," SPRU Working Paper Series 2018-21, SPRU - Science Policy Research Unit, University of Sussex Business School.
    18. Gabriele Sampagnaro, 2023. "Keyword occurrences and journal specialization," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5629-5645, October.
    19. Ricardo Arencibia-Jorge & Rosa Lidia Vega-Almeida & José Luis Jiménez-Andrade & Humberto Carrillo-Calvet, 2022. "Evolutionary stages and multidisciplinary nature of artificial intelligence research," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5139-5158, September.
    20. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:10:d:10.1007_s11192-024-05159-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.