Printed from https://ideas.repec.org/a/eee/infome/v11y2017i2p389-406.html

Uncovering fine-grained research excellence: The global research benchmarking system

Author

Listed:
  • Haddawy, Peter
  • Hassan, Saeed-Ul
  • Abbey, Craig W.
  • Lee, Inn Beng

Abstract

Since few universities can afford to be excellent in all subject areas, university administrators face the difficult decision of selecting areas for strategic investment. While the past decade has seen a proliferation of university ranking systems, several aspects in the design of most ranking systems make them inappropriate to benchmark performance in a way that supports formulation of effective institutional research strategy. To support strategic decision making, universities require research benchmarking data that is sufficiently fine-grained to show variation among specific research areas and identify focused areas of excellence; is objective and verifiable; and provides meaningful comparisons across the diversity of national higher education environments. This paper describes the Global Research Benchmarking System (GRBS) which satisfies these requirements by providing fine-grained objective data to internationally benchmark university research performance in over 250 areas of Science and Technology. We provide analyses of research performance at country and university levels, using the diversity of indicators in GRBS to examine distributions of research quality in countries and universities as well as to contrast university research performance from volume and quality perspectives. A comparison of the GRBS results with those of the three predominant ranking systems shows how GRBS is able to identify pockets of excellence within universities that are overlooked by the more traditional aggregate level approaches.

Suggested Citation

  • Haddawy, Peter & Hassan, Saeed-Ul & Abbey, Craig W. & Lee, Inn Beng, 2017. "Uncovering fine-grained research excellence: The global research benchmarking system," Journal of Informetrics, Elsevier, vol. 11(2), pages 389-406.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:2:p:389-406
    DOI: 10.1016/j.joi.2017.02.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157715300146
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2017.02.004?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ahlgren, Per & Waltman, Ludo, 2014. "The correlation between citation-based and expert-based assessments of publication channels: SNIP and SJR vs. Norwegian quality assessments," Journal of Informetrics, Elsevier, vol. 8(4), pages 985-996.
    2. Jamil Salmi, 2009. "The Challenge of Establishing World-Class Universities," World Bank Publications - Books, The World Bank Group, number 2600.
    3. Jean-Charles Billaut & Denis Bouyssou & Philippe Vincke, 2010. "Should you believe in the Shanghai ranking?," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(1), pages 237-263, July.
    4. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    5. repec:dau:papers:123456789/2947 is not listed on IDEAS
    6. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    7. Haddawy, Peter & Hassan, Saeed-Ul & Asghar, Awais & Amin, Sarah, 2016. "A comprehensive examination of the relation of three citation-based journal metrics to expert judgment of journal quality," Journal of Informetrics, Elsevier, vol. 10(1), pages 162-173.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Saeed-Ul Hassan & Mubashir Imran & Uzair Gillani & Naif Radi Aljohani & Timothy D. Bowman & Fereshteh Didegah, 2017. "Measuring social media activity of scientific literature: an exhaustive comparison of scopus and novel altmetrics big data," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1037-1057, November.
    2. Bonaccorsi, Andrea & Haddawy, Peter & Cicero, Tindaro & Hassan, Saeed-Ul, 2017. "The solitude of stars. An analysis of the distributed excellence model of European universities," Journal of Informetrics, Elsevier, vol. 11(2), pages 435-454.
    3. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    4. Saeed-Ul Hassan & Sehrish Iqbal & Naif R. Aljohani & Salem Alelyani & Alesia Zuccala, 2020. "Introducing the ‘alt-index’ for measuring the social visibility of scientific research," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(3), pages 1407-1419, June.
    5. Bonaccorsi, Andrea & Belingheri, Paola & Secondi, Luca, 2021. "The research productivity of universities. A multilevel and multidisciplinary analysis on European institutions," Journal of Informetrics, Elsevier, vol. 15(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Johan Lyhagen & Per Ahlgren, 2020. "Uncertainty and the ranking of economics journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2545-2560, December.
    2. Walters, William H., 2017. "Do subjective journal ratings represent whole journals or typical articles? Unweighted or weighted citation impact?," Journal of Informetrics, Elsevier, vol. 11(3), pages 730-744.
    3. Massucci, Francesco Alessandro & Docampo, Domingo, 2019. "Measuring the academic reputation through citation networks via PageRank," Journal of Informetrics, Elsevier, vol. 13(1), pages 185-201.
    4. Juntao Zheng & Niancai Liu, 2015. "Mapping of important international academic awards," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 763-791, September.
    5. Henk F. Moed & Gali Halevi, 2015. "Multidimensional assessment of scholarly research impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 1988-2002, October.
    6. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    7. Bonaccorsi, Andrea & Haddawy, Peter & Cicero, Tindaro & Hassan, Saeed-Ul, 2017. "The solitude of stars. An analysis of the distributed excellence model of European universities," Journal of Informetrics, Elsevier, vol. 11(2), pages 435-454.
    8. D. Docampo & D. Egret & L. Cram, 2015. "The effect of university mergers on the Shanghai ranking," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(1), pages 175-191, July.
    9. Enar Ruiz-Conde & Aurora Calderón-Martínez, 2014. "University institutional repositories: competitive environment and their role as communication media of scientific knowledge," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1283-1299, February.
    10. Andrea Bonaccorsi & Tindaro Cicero & Peter Haddawy & Saeed-UL Hassan, 2017. "Explaining the transatlantic gap in research excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 217-241, January.
    11. Aparna Basu & Sumit Kumar Banshal & Khushboo Singhal & Vivek Kumar Singh, 2016. "Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1171-1193, June.
    12. Gabriel-Alexandru Vîiu & Mihai Păunescu & Adrian Miroiu, 2016. "Research-driven classification and ranking in higher education: an empirical appraisal of a Romanian policy experience," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 785-805, May.
    13. M. L. Bougnol & J. H. Dulá, 2013. "A mathematical model to optimize decisions to impact multi-attribute rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(2), pages 785-796, May.
    14. El Gibari, Samira & Gómez, Trinidad & Ruiz, Francisco, 2018. "Evaluating university performance using reference point based composite indicators," Journal of Informetrics, Elsevier, vol. 12(4), pages 1235-1250.
    15. Khatab Alqararah, 2023. "Assessing the robustness of composite indicators: the case of the Global Innovation Index," Journal of Innovation and Entrepreneurship, Springer, vol. 12(1), pages 1-22, December.
    16. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    17. Zofio, Jose Luis & Aparicio, Juan & Barbero, Javier & Zabala-Iturriagagoitia, Jon Mikel, 2023. "The influence of bottlenecks on innovation systems performance: Put the slowest climber first," Technological Forecasting and Social Change, Elsevier, vol. 193(C).
    18. Saisana, Michaela & d'Hombres, Béatrice & Saltelli, Andrea, 2011. "Rickety numbers: Volatility of university rankings and policy implications," Research Policy, Elsevier, vol. 40(1), pages 165-177, February.
    19. Carayannis, Elias G. & Grigoroudis, Evangelos & Wurth, Bernd, 2022. "OR for entrepreneurial ecosystems: A problem-oriented review and agenda," European Journal of Operational Research, Elsevier, vol. 300(3), pages 791-808.
    20. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2015. "Evaluating university research: Same performance indicator, different rankings," Journal of Informetrics, Elsevier, vol. 9(3), pages 514-525.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:11:y:2017:i:2:p:389-406. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.