
Are university rankings useful to improve research? A systematic review

Authors

  • Marlo M Vernon
  • E Andrew Balas
  • Shaher Momani

Abstract

Introduction: Concerns about the reproducibility and impact of research have prompted improvement initiatives. Current university ranking systems evaluate and compare universities on measures of academic and research performance. Although often useful for marketing purposes, the value of ranking systems for examining quality and outcomes is unclear. The purpose of this study was to evaluate the usefulness of ranking systems and to identify opportunities to support research quality and performance improvement.

Methods: A systematic review of university ranking systems was conducted to investigate research performance and academic quality measures. To be eligible, a ranking system had to include at least 100 doctoral-granting institutions, be currently produced on an ongoing basis, include both global and US universities, publish its rank calculation methodology in English, and calculate ranks independently. Ranking systems also had to include some measure of research outcomes. Indicators were abstracted and contrasted with basic quality improvement requirements. Aggregation methods, the validity of research and academic quality indicators, and suitability for quality improvement within ranking systems were also explored.

Results: A total of 24 ranking systems were identified, of which 13 were eligible and evaluated. Six of the 13 rankings focus entirely on research performance. Among systems that report weighting, 76% of the total rank is attributed to research indicators and 24% to academic or teaching quality. Seven systems rely on reputation surveys and/or faculty and alumni awards. Rankings influence academic choice, yet research performance measures are the most heavily weighted indicators. There are no generally accepted academic quality indicators in ranking systems.

Discussion: No single ranking system provides a comprehensive evaluation of research and academic quality. Combining the Leiden, Thomson Reuters Most Innovative Universities, and SCImago ranking systems may give institutions more effective feedback for research improvement. Rankings that rely extensively on subjective reputation and “luxury” indicators, such as award-winning faculty or alumni who are high-ranking executives, are not well suited to academic or research performance improvement initiatives. Future efforts should better explore measurement of university research performance through comprehensive and standardized indicators. This paper could serve as a general literature citation when one or more university ranking systems are used in efforts to improve academic prominence and research performance.

Suggested Citation

  • Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
  • Handle: RePEc:plo:pone00:0193762
    DOI: 10.1371/journal.pone.0193762

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0193762
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0193762&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0193762?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Bruno S. Frey & Katja Rost, 2010. "Do rankings reflect research quality?," Journal of Applied Economics, Universidad del CEMA, vol. 13, pages 1-38, May.
    3. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    4. Carolyn J. Heinrich & Gerald Marschke, 2010. "Incentives and their dynamics in public sector performance management systems," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 29(1), pages 183-208.
    5. David Moher & Alessandro Liberati & Jennifer Tetzlaff & Douglas G Altman & The PRISMA Group, 2009. "Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement," PLOS Medicine, Public Library of Science, vol. 6(7), pages 1-6, July.
    6. Saisana, Michaela & d'Hombres, Béatrice & Saltelli, Andrea, 2011. "Rickety numbers: Volatility of university rankings and policy implications," Research Policy, Elsevier, vol. 40(1), pages 165-177, February.
    7. Isidro F. Aguillo & Judit Bar-Ilan & Mark Levene & José Luis Ortega, 2010. "Comparing university rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 243-256, October.
    8. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    9. Henk F. Moed, 2017. "A critical comparative analysis of five world university rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 967-990, February.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Ye Sun & Athen Ma & Georg von Graevenitz & Vito Latora, 2023. "The importance of quality in austere times: University competitiveness and grant income," Papers 2309.15309, arXiv.org.
    2. Gustavo Vaccaro & Pablo Sánchez-Núñez & Patricia Witt-Rodríguez, 2022. "Bibliometrics Evaluation of Scientific Journals and Country Research Output of Dental Research in Latin America Using Scimago Journal and Country Rank," Publications, MDPI, vol. 10(3), pages 1-22, August.
    3. Önder, Ali Sina & Schweitzer, Sascha & Yilmazkuday, Hakan, 2021. "Specialization, field distance, and quality in economists’ collaborations," Journal of Informetrics, Elsevier, vol. 15(4).
    4. Gerhard Reichmann & Christian Schlögl, 2022. "On the possibilities of presenting the research performance of an institute over a long period of time: the case of the Institute of Information Science at the University of Graz in Austria," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3193-3223, June.
    5. Neszveda, Gábor & Horváth, Zsófia, 2024. "Új szempont a magyar felsőoktatási intézmények teljesítményének mérésében - az egyetemek online láthatósága [Online visibility - a new aspect of measuring the performance of Hungarian higher education institutions]," Közgazdasági Szemle (Economic Review - monthly of the Hungarian Academy of Sciences), Közgazdasági Szemle Alapítvány (Economic Review Foundation), vol. 0(7), pages 755-790.
    6. Benedetto Lepori & Aldo Geuna & Antonietta Mira, 2019. "Scientific output scales with resources. A comparison of US and European universities," PLOS ONE, Public Library of Science, vol. 14(10), pages 1-18, October.
    7. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    8. Meng-Chen Zhang & Bo-Wei Zhu & Chao-Meng Huang & Gwo-Hshiung Tzeng, 2021. "Systematic Evaluation Model for Developing Sustainable World-Class Universities: An East Asian Perspective," Mathematics, MDPI, vol. 9(8), pages 1-20, April.
    9. Wysocka Karolina & Jungnickel Christian & Szelągowska-Rudzka Katarzyna, 2022. "Internationalization and Quality Assurance in Higher Education," Management, Sciendo, vol. 26(1), pages 204-230, January.
    10. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2022. "Using the Leiden Rankings as a Heuristics: Evidence from Italian universities in the European landscape," LEM Papers Series 2022/08, Laboratory of Economics and Management (LEM), Sant'Anna School of Advanced Studies, Pisa, Italy.
    11. Santini, Mateus Augusto Fassina & Faccin, Kadígia & Balestrin, Alsones & Volkmer Martins, Bibiana, 2021. "How the relational structure of universities influences research and development results," Journal of Business Research, Elsevier, vol. 125(C), pages 155-163.
    12. Massucci, Francesco Alessandro & Docampo, Domingo, 2019. "Measuring the academic reputation through citation networks via PageRank," Journal of Informetrics, Elsevier, vol. 13(1), pages 185-201.
    13. Thomas Zacharewicz & Noemi Pulido Pavón & Luis Antonio & Benedetto Lepori, 2023. "Do funding modes matter? A multilevel analysis of funding allocation mechanisms on university research performance," Research Evaluation, Oxford University Press, vol. 32(3), pages 545-556.
    14. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2023. "A heuristic approach based on Leiden rankings to identify outliers: evidence from Italian universities in the European landscape," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 483-510, January.
    15. Zorica Lazić & Aleksandar Đorđević & Albina Gazizulina, 2021. "Improvement of Quality of Higher Education Institutions as a Basis for Improvement of Quality of Life," Sustainability, MDPI, vol. 13(8), pages 1-27, April.
    16. Hong Li & Zilin Chen, 2022. "A Comprehensive Evaluation Framework to Assess the Sustainable Development of Schools within a University: Application to a Chinese University," Sustainability, MDPI, vol. 14(17), pages 1-12, August.
    17. Syed Haider Khalil & Syed Mohsin Ali Shah & Fahad Sultan & Muhammad Ibrahim Khan & Sher Nawaz, 2023. "Categories and Institutional Change: Contesting the Uncontested Space Through National Rankings," SAGE Open, vol. 13(3), pages 21582440231, September.
    18. Ali Sina Önder & Sascha Schweitzer & Hakan Yilmazkuday, 2021. "Field Distance and Quality in Economists’ Collaborations," Working Papers in Economics & Finance 2021-04, University of Portsmouth, Portsmouth Business School, Economics and Finance Subject Group.
    19. Marlo M. Vernon & Frances M. Yang, 2023. "Use of Latent Profile Analysis to Model the Translation of University Research into Health Practice and Policy: Exploration of Proposed Metrics," Research in Higher Education, Springer;Association for Institutional Research, vol. 64(7), pages 1058-1070, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nadine Desrochers & Adèle Paul‐Hus & Jen Pecoskie, 2017. "Five decades of gratitude: A meta‐synthesis of acknowledgments research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(12), pages 2821-2833, December.
    2. Emilio Abad-Segura & Alfonso Infante-Moro & Mariana-Daniela González-Zamar & Eloy López-Meneses, 2021. "Blockchain Technology for Secure Accounting Management: Research Trends Analysis," Mathematics, MDPI, vol. 9(14), pages 1-26, July.
    3. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "University citation distributions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2790-2804, November.
    4. Angel Meseguer-Martinez & Alejandro Ros-Galvez & Alfonso Rosa-Garcia & Jose Antonio Catalan-Alarcon, 2019. "Online video impact of world class universities," Electronic Markets, Springer;IIM University of St. Gallen, vol. 29(3), pages 519-532, September.
    5. S. P. J. M. Horbach & W. Halffman, 2019. "The ability of different peer review procedures to flag problematic publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 339-373, January.
    6. Vicente Safón, 2019. "Inter-ranking reputational effects: an analysis of the Academic Ranking of World Universities (ARWU) and the Times Higher Education World University Rankings (THE) reputational relationship," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 897-915, November.
    7. Yanto Chandra, 2018. "Mapping the evolution of entrepreneurship as a field of research (1990–2013): A scientometric analysis," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-24, January.
    8. Diego Chavarro & Puay Tang & Ismael Rafols, 2014. "Interdisciplinarity and research on local issues: evidence from a developing country," Research Evaluation, Oxford University Press, vol. 23(3), pages 195-209.
    9. María Pinto & Rosaura Fernández-Pascual & David Caballero-Mariscal & Dora Sales, 2020. "Information literacy trends in higher education (2006–2019): visualizing the emerging field of mobile information literacy," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1479-1510, August.
    10. K. J. Wang & J. Widagdo & Y. S. Lin & H. L. Yang & S. L. Hsiao, 2016. "A service innovation framework for start-up firms by integrating service experience engineering approach and capability maturity model," Service Business, Springer;Pan-Pacific Business Association, vol. 10(4), pages 867-916, December.
    11. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    12. Yongjun Zhu & Erjia Yan, 2015. "Dynamic subfield analysis of disciplines: an examination of the trading impact and knowledge diffusion patterns of computer science," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(1), pages 335-359, July.
    13. Ignacio Rodríguez-Rodríguez & José-Víctor Rodríguez & Niloofar Shirvanizadeh & Andrés Ortiz & Domingo-Javier Pardo-Quiles, 2021. "Applications of Artificial Intelligence, Machine Learning, Big Data and the Internet of Things to the COVID-19 Pandemic: A Scientometric Review Using Text Mining," IJERPH, MDPI, vol. 18(16), pages 1-29, August.
    14. Llopis, Oscar & D'Este, Pablo & McKelvey, Maureen & Yegros, Alfredo, 2022. "Navigating multiple logics: Legitimacy and the quest for societal impact in science," Technovation, Elsevier, vol. 110(C).
    15. Csató, László & Tóth, Csaba, 2020. "University rankings from the revealed preferences of the applicants," European Journal of Operational Research, Elsevier, vol. 286(1), pages 309-320.
    16. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    17. Murat Perit Çakır & Cengiz Acartürk & Oğuzhan Alaşehir & Canan Çilingir, 2015. "A comparative analysis of global and national university ranking systems," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 813-848, June.
    18. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    19. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    20. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2014. "Evolutionary games between subject categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 869-888, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0193762. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.