
Predicting results of the Research Excellence Framework using departmental h-index

Author

Listed:
  • O. Mryglod

    (National Academy of Sciences of Ukraine)

  • R. Kenna

    (Coventry University)

  • Yu. Holovatch

    (National Academy of Sciences of Ukraine)

  • B. Berche

    (Université de Lorraine, Statistical Physics Group, IJL, UMR CNRS 7198)

Abstract

We compare estimates for past institutional research performances coming from two bibliometric indicators to the results of the UK’s Research Assessment Exercise which last took place in 2008. We demonstrate that a version of the departmental h-index is better correlated with the actual results of that peer-review exercise than a competing metric known as the normalised citation-based indicator. We then determine the corresponding h-indices for 2008–2013, the period examined in the UK’s Research Excellence Framework (REF) 2014. We place herewith the resulting predictions on the arXiv in advance of the REF results being published (December 2014). These may be considered as unbiased predictions of relative performances in that exercise. We will revisit this paper after the REF results are available and comment on the reliability or otherwise of these bibliometrics as compared with peer review.
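The abstract describes ranking departments by a departmental h-index and correlating that ranking with peer-review outcomes. Below is a minimal, hypothetical sketch of that idea in Python; the function name, the citation counts, and the peer-review scores are illustrative assumptions, not the authors' data or procedure.

    # Sketch (assumptions only): compute a departmental h-index from per-paper
    # citation counts, then compare the resulting ranking with hypothetical
    # peer-review scores via a Spearman rank correlation.
    from scipy.stats import spearmanr

    def departmental_h_index(citations):
        """Largest h such that the department has h papers with >= h citations each."""
        counts = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(counts, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    # Hypothetical citation counts for three departments (illustration only).
    departments = {
        "Dept A": [45, 30, 12, 9, 7, 3, 1],
        "Dept B": [20, 15, 15, 10, 2],
        "Dept C": [8, 6, 5, 5, 4, 4, 1],
    }
    h_scores = {d: departmental_h_index(c) for d, c in departments.items()}

    # Hypothetical peer-review quality scores to rank against (RAE/REF-style).
    peer_scores = {"Dept A": 3.1, "Dept B": 2.8, "Dept C": 2.2}

    names = list(departments)
    rho, p = spearmanr([h_scores[n] for n in names],
                       [peer_scores[n] for n in names])
    print(h_scores, f"Spearman rho = {rho:.2f}")

The rank correlation is only a stand-in for the comparison the paper performs against the 2008 RAE results; it illustrates the general approach rather than reproducing it.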

Suggested Citation

  • O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the Research Excellence Framework using departmental h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2165-2180, March.
  • Handle: RePEc:spr:scient:v:102:y:2015:i:3:d:10.1007_s11192-014-1512-3
    DOI: 10.1007/s11192-014-1512-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-014-1512-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-014-1512-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2013. "Absolute and specific measures of research group excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 115-127, April.
    2. Michael H. MacRoberts & Barbara R. MacRoberts, 1989. "Problems of citation analysis: A critical review," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 40(5), pages 342-349, September.
    3. Jean-Francois Molinari & Alain Molinari, 2008. "A new methodology for ranking scientific institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 75(1), pages 163-174, April.
    4. Anthony F. J. van Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    5. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2013. "Comparison of a citation-based indicator and peer review for absolute and specific measures of research-group excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 767-777, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. James Tooley & Barrie Craven, 2018. "Private Sector Alternatives to the Research Excellence Framework for University League Tables," Economic Affairs, Wiley Blackwell, vol. 38(3), pages 434-443, October.
    2. Stephan B. Bruns & David I. Stern, 2016. "Research assessment using early citation information," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(2), pages 917-935, August.
    3. Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
    4. Daniele Checchi & Alberto Ciolfi & Gianni De Fraja & Irene Mazzotta & Stefano Verzillo, 2021. "Have you Read This? An Empirical Comparison of the British REF Peer Review and the Italian VQR Bibliometric Algorithm," Economica, London School of Economics and Political Science, vol. 88(352), pages 1107-1129, October.
    5. Thelwall, Mike & Kousha, Kayvan & Stuart, Emma & Makita, Meiko & Abdoli, Mahshid & Wilson, Paul & Levitt, Jonathan, 2023. "Do bibliometrics introduce gender, institutional or interdisciplinary biases into research evaluations?," Research Policy, Elsevier, vol. 52(8).
    6. Shahd Al-Janabi & Lee Wei Lim & Luca Aquili, 2021. "Development of a tool to accurately predict UK REF funding allocation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 8049-8062, September.
    7. Basso, Antonella & di Tollo, Giacomo, 2022. "Prediction of UK research excellence framework assessment by the departmental h-index," European Journal of Operational Research, Elsevier, vol. 296(3), pages 1036-1049.
    8. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    9. Banal-Estañol, Albert & Jofre-Bonet, Mireia & Iori, Giulia & Maynou, Laia & Tumminello, Michele & Vassallo, Pietro, 2023. "Performance-based research funding: Evidence from the largest natural experiment worldwide," Research Policy, Elsevier, vol. 52(6).
    10. Lloyd D Balbuena, 2018. "The UK Research Excellence Framework and the Matthew effect: Insights from machine learning," PLOS ONE, Public Library of Science, vol. 13(11), pages 1-13, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    2. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    3. Petridis, Konstantinos & Malesios, Chrisovalantis & Arabatzis, Garyfallos & Thanassoulis, Emmanuel, 2013. "Efficiency analysis of forestry journals: Suggestions for improving journals’ quality," Journal of Informetrics, Elsevier, vol. 7(2), pages 505-521.
    4. Paul Benneworth, 2015. "Between certainty and comprehensiveness in evaluating the societal impact of humanities research," CHEPS Working Papers 201502, University of Twente, Center for Higher Education Policy Studies (CHEPS).
    5. Hana Tomaskova & Martin Kopecky, 2020. "Specialization of Business Process Model and Notation Applications in Medicine—A Review," Data, MDPI, vol. 5(4), pages 1-42, October.
    6. R. Álvarez & E. Cahué & J. Clemente-Gallardo & A. Ferrer & D. Íñiguez & X. Mellado & A. Rivero & G. Ruiz & F. Sanz & E. Serrano & A. Tarancón & Y. Vergara, 2015. "Analysis of academic productivity based on Complex Networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 651-672, September.
    7. Kuan, Chung-Huei & Huang, Mu-Hsuan & Chen, Dar-Zen, 2013. "Cross-field evaluation of publications of research institutes using their contributions to the fields’ MVPs determined by h-index," Journal of Informetrics, Elsevier, vol. 7(2), pages 455-468.
    8. Sebastian K. Boell & Concepción S. Wilson, 2010. "Journal Impact Factors for evaluating scientific performance: use of h-like indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(3), pages 613-626, March.
    9. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    10. Vieira, E.S. & Gomes, J.A.N.F., 2010. "A research impact indicator for institutions," Journal of Informetrics, Elsevier, vol. 4(4), pages 581-590.
    11. Filippo Radicchi & Claudio Castellano, 2013. "Analysis of bibliometric indicators for individual scholars in a large data set," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 627-637, December.
    12. Thelwall, Mike & Wilson, Paul, 2014. "Regression for citation data: An evaluation of different methods," Journal of Informetrics, Elsevier, vol. 8(4), pages 963-971.
    13. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    14. Gad Saad, 2010. "Applying the h-index in exploring bibliometric properties of elite marketing scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 423-433, May.
    15. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    16. O. Mryglod & Yu. Holovatch & R. Kenna, 2022. "Big fish and small ponds: why the departmental h-index should not be used to rank universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3279-3292, June.
    17. Liyin Zhang & Yuchen Qian & Chao Ma & Jiang Li, 2023. "Continued collaboration shortens the transition period of scientists who move to another institution," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1765-1784, March.
    18. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    19. Stephan B. Bruns & David I. Stern, 2016. "Research assessment using early citation information," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(2), pages 917-935, August.
    20. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the research excellence framework using departmental h-index: revisited," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 1013-1017, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:102:y:2015:i:3:d:10.1007_s11192-014-1512-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.