Printed from https://ideas.repec.org/a/plo/pone00/0047210.html

Counting Highly Cited Papers for University Research Assessment: Conceptual and Technical Issues

Author

  • Alonso Rodríguez-Navarro

Abstract

A Kuhnian approach to research assessment requires us to consider that the important scientific breakthroughs that drive scientific progress are infrequent and that the progress of science does not depend on normal research. Consequently, indicators of research performance based on the total number of papers do not accurately measure scientific progress. Similarly, the universities with the best reputations for scientific progress differ widely from other universities in the scale of their research investments and in their higher concentrations of outstanding scientists, but much less so in their total numbers of papers or citations. This study argues that indicators for the 1% high-citation tail of the citation distribution reveal the contribution of universities to the progress of science and provide quantifiable justification for the large investments in research made by elite research universities. In this tail, which follows a power law, the number of the rarer, most highly cited breakthrough papers can be predicted from the frequencies of papers in the upper part of the tail. This study quantifies the false impression of excellence produced by multinational papers, and by other types of papers that do not contribute to the progress of science. Many of these papers are concentrated in and dominate lists of highly cited papers, especially at lower-ranked universities. The h-index obscures the differences between higher- and lower-ranked universities because the proportion of h-core papers in the 1% high-citation tail is not proportional to the value of the h-index.
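The two quantitative ideas in the abstract — extrapolating the count of very highly cited papers from a power-law tail, and comparing the h-core with the 1% high-citation tail — can be sketched as follows. This is an illustrative toy, not the paper's actual method: the synthetic citation counts, the thresholds `c1`/`c2`, and the use of a Hill estimator for the tail exponent are all assumptions made for the example.

```python
import math
import random

random.seed(0)

# Hypothetical citation counts for one university's papers, drawn from a
# heavy-tailed (Pareto-like) distribution purely for illustration.
citations = sorted(
    (int(10 * (random.paretovariate(1.5) - 1)) for _ in range(5000)),
    reverse=True,
)

# --- Power-law tail: extrapolating rare, very highly cited papers ---
# If the high-citation tail follows P(C >= c) ~ c^(-alpha), the number of
# papers above a high threshold c2 can be predicted from a lower one c1:
#     N(c2) ~ N(c1) * (c2 / c1) ** (-alpha)
k = max(len(citations) // 100, 2)          # size of the 1% high-citation tail
tail = [c for c in citations[:k] if c > 0]
x_min = min(tail)

# Hill estimator of the tail exponent alpha, fitted on the top 1% of papers
alpha = len(tail) / sum(math.log(c / x_min) for c in tail)

c1, c2 = 50, 250                           # hypothetical citation thresholds
n_c1 = sum(1 for c in citations if c >= c1)
n_c2_pred = n_c1 * (c2 / c1) ** (-alpha)
print(f"alpha ~ {alpha:.2f}; predicted papers with >= {c2} citations: {n_c2_pred:.1f}")

# --- h-index vs. the 1% tail ---
# h is the largest number such that h papers each have at least h citations.
h = sum(1 for i, c in enumerate(citations, start=1) if c >= i)
top1_cutoff = citations[k - 1]             # citation count at the 1% boundary
h_core_in_tail = sum(1 for c in citations[:h] if c >= top1_cutoff) / h
print(f"h-index = {h}; share of h-core papers in the 1% tail = {h_core_in_tail:.2f}")
```

The last ratio illustrates the abstract's closing point: two universities with similar h-indices can have very different shares of their h-core papers inside the 1% high-citation tail, so the h-index alone does not separate them.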

Suggested Citation

  • Alonso Rodríguez-Navarro, 2012. "Counting Highly Cited Papers for University Research Assessment: Conceptual and Technical Issues," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
  • Handle: RePEc:plo:pone00:0047210
    DOI: 10.1371/journal.pone.0047210

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0047210
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0047210&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0047210?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2011. "Evaluating research: from informed peer review to bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 499-514, June.
    2. Bruno S. Frey & Katja Rost, 2010. "Do rankings reflect research quality?," Journal of Applied Economics, Universidad del CEMA, vol. 13, pages 1-38, May.
    3. Kostoff, Ronald N. & Geisler, Elie, 2007. "The unintended consequences of metrics in technology evaluation," Journal of Informetrics, Elsevier, vol. 1(2), pages 103-114.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Silvia Sacchetti, 2013. "Motivational resilience in the university system," Chapters, in: Roger Sugden & Marcela Valania & James R. Wilson (ed.), Leadership and Cooperation in Academia, chapter 8, pages 107-127, Edward Elgar Publishing.
    2. Klaus Wohlrabe, 2018. "Some Remarks on the 2018 FAZ Economist Ranking" [in German], ifo Schnelldienst, ifo Institute - Leibniz Institute for Economic Research at the University of Munich, vol. 71(20), pages 29-33, October.
    3. Müller, Harry, 2012. "Citation Frequency as a Quality Indicator in Research Performance Measurement" [in German], Discussion Papers of the Institute for Organisational Economics 1/2012, University of Münster, Institute for Organisational Economics.
    4. Gebhard Kirchgässner, 2011. "Talent Factories for Business and/or Universities? The Mission of Business Universities and Faculties in the 21st Century" [in German], Perspektiven der Wirtschaftspolitik, Verein für Socialpolitik, vol. 12(3), pages 317-337, August.
    5. Justus Haucap & Johannes Muck, 2015. "What drives the relevance and reputation of economics journals? An update from a survey among economists," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 849-877, June.
    6. Daraio, Cinzia & Bonaccorsi, Andrea & Geuna, Aldo & Lepori, Benedetto & Bach, Laurent & Bogetoft, Peter & F. Cardoso, Margarida & Castro-Martinez, Elena & Crespi, Gustavo & de Lucio, Ignacio Fernandez, 2011. "The European university landscape: A micro characterization based on evidence from the Aquameth project," Research Policy, Elsevier, vol. 40(1), pages 148-164, February.
    7. Olugbenga Oladinrin & Kasun Gomis & Wadu Mesthrige Jayantha & Lovelin Obi & Muhammad Qasim Rana, 2021. "Scientometric Analysis of Global Scientific Literature on Aging in Place," IJERPH, MDPI, vol. 18(23), pages 1-16, November.
    8. Nadeem Shafique Butt & Ahmad Azam Malik & Muhammad Qaiser Shahbaz, 2021. "Bibliometric Analysis of Statistics Journals Indexed in Web of Science Under Emerging Source Citation Index," SAGE Open, , vol. 11(1), pages 21582440209, January.
    9. Eliseo Reategui & Alause Pires & Michel Carniato & Sergio Roberto Kieling Franco, 2020. "Evaluation of Brazilian research output in education: confronting international and national contexts," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 427-444, October.
    10. Nikolaos A. Kazakis, 2014. "Bibliometric evaluation of the research performance of the Greek civil engineering departments in National and European context," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 505-525, October.
    11. Xie, Yundong & Wu, Qiang & Zhang, Peng & Li, Xingchen, 2020. "Information Science and Library Science (IS-LS) journal subject categorisation and comparison based on editorship information," Journal of Informetrics, Elsevier, vol. 14(4).
    12. Hamid Bouabid & Hind Achachi, 2022. "Size of science team at university and internal co-publications: science policy implications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6993-7013, December.
    13. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    14. Gennaro Guida, 2018. "Italian Economics Departments’ Scientific Research Performance: Comparison between VQR and ASN Methodologies," International Journal of Business and Management, Canadian Center of Science and Education, vol. 13(9), pages 182-182, August.
    15. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.
    16. Woiwode, Hendrik, 2020. "Scholars as government-appointed research evaluators: Do they create congruence between their professional quality standards and political demands?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 15(10), pages 1-1.
    17. Oriana Gava & Fabio Bartolini & Francesca Venturi & Gianluca Brunori & Alberto Pardossi, 2020. "Improving Policy Evidence Base for Agricultural Sustainability and Food Security: A Content Analysis of Life Cycle Assessment Research," Sustainability, MDPI, vol. 12(3), pages 1-29, February.
    18. Mingkun Wei, 2020. "Research on impact evaluation of open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1027-1049, February.
    19. Hendrik Woiwode, 2020. "Scholars as government-appointed research evaluators: Do they create congruence between their professional quality standards and political demands?," PLOS ONE, Public Library of Science, vol. 15(10), pages 1-14, October.
    20. Franc Mali, 2013. "Why an Unbiased External R&D Evaluation System is Important for the Progress of Social Sciences—the Case of a Small Social Science Community," Social Sciences, MDPI, vol. 2(4), pages 1-14, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0047210. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help by filling in this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.