IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v126y2021i9d10.1007_s11192-021-04030-w.html

Development of a tool to accurately predict UK REF funding allocation

Author

Listed:
  • Shahd Al-Janabi

    (Charles Darwin University)

  • Lee Wei Lim

    (The University of Hong Kong)

  • Luca Aquili

    (Charles Darwin University
    The University of Hong Kong)

Abstract

Understanding the determinants of research funding allocation by funding bodies, such as the Research Excellence Framework (REF) in the UK, is vital to help institutions prepare for their research quality assessments. In these assessments, only publications ranked as 4* or 3* (but not 2* or less) receive funding. Correlational studies have shown that the impact factor (IF) of a publication is associated with REF rankings. Yet, the precise IF boundaries leading to each rank are unknown; for example, would a publication with an IF of 5 be ranked 4* or less? Here, we provide a tool that predicts the rank of each submitted publication to (1) help researchers choose a publication outlet that would more likely lead to the submission of their research output(s) by faculty heads in the next REF assessment, thereby potentially improving their academic profile; and (2) help faculty heads decide which outputs to submit for assessment, thereby maximising their future REF scores and ultimately their research funding. Initially, we applied our tool to the REF 2014 results (Institutions Ranked by Subject, 2014; https://www.timeshighereducation.com/sites/default/files/Attachments/2014/12/17/g/o/l/sub-14-01.pdf) for Neuroscience, Psychiatry, and Psychology, which predicted publications ranked 4* with 95% accuracy (IF ≥ 6.5), 3* with 98% accuracy (IF = 2.9–6.49), and 2* with 95% accuracy (IF = 1.3–2.89); thus indicating that researchers wishing to increase their chances of a 4* rating for the aforementioned Unit of Assessment should submit to journals with IFs of at least 6.5. We then generalised these findings to another REF unit of assessment, Biological Sciences, to further demonstrate the predictive capacity of our tool.
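The thresholds reported in the abstract amount to a simple step function over the impact factor. A minimal sketch of that rule (the function name, return labels, and accuracy comments are illustrative, taken only from the figures quoted above, not from the paper's actual implementation):

```python
def predict_ref_rank(impact_factor: float) -> str:
    """Map a journal impact factor (IF) to a predicted REF star rating,
    using the thresholds the abstract reports for the Neuroscience,
    Psychiatry, and Psychology unit of assessment."""
    if impact_factor >= 6.5:
        return "4*"  # predicted with 95% accuracy in the study
    if impact_factor >= 2.9:
        return "3*"  # IF 2.9-6.49, 98% accuracy
    if impact_factor >= 1.3:
        return "2*"  # IF 1.3-2.89, 95% accuracy
    return "1* or unclassified"  # below the funded ratings

# The abstract's worked question: would an IF of 5 be ranked 4* or less?
print(predict_ref_rank(5.0))  # -> 3*
```

Under this rule, the paper's example publication with an IF of 5 falls in the 3* band, below the 6.5 cut-off for 4*.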

Suggested Citation

  • Shahd Al-Janabi & Lee Wei Lim & Luca Aquili, 2021. "Development of a tool to accurately predict UK REF funding allocation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 8049-8062, September.
  • Handle: RePEc:spr:scient:v:126:y:2021:i:9:d:10.1007_s11192-021-04030-w
    DOI: 10.1007/s11192-021-04030-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-021-04030-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-021-04030-w?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Győrffy, Balázs & Herman, Péter & Szabó, István, 2020. "Research funding: past performance is a stronger predictor of future scientific output than reviewer scores," Journal of Informetrics, Elsevier, vol. 14(3).
    2. Benjamin M. Althouse & Jevin D. West & Carl T. Bergstrom & Theodore Bergstrom, 2009. "Differences in impact factor across fields and over time," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(1), pages 27-34, January.
3. Rob Law & Daniel Leung, 2020. "Journal impact factor: A valid symbol of journal quality?," Tourism Economics, vol. 26(5), pages 734-742, August.
    4. Daniel E. Acuna & Stefano Allesina & Konrad P. Kording, 2012. "Predicting scientific success," Nature, Nature, vol. 489(7415), pages 201-202, September.
    5. Stevan Harnad, 2009. "Open access scientometrics and the UK Research Assessment Exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(1), pages 147-156, April.
    6. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the research excellence framework using departmental h-index: revisited," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 1013-1017, September.
    7. Ewen Callaway, 2016. "Beat it, impact factor! Publishing elite turns against controversial metric," Nature, Nature, vol. 535(7611), pages 210-211, July.
    8. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the Research Excellence Framework using departmental h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2165-2180, March.
    9. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    10. Lloyd D Balbuena, 2018. "The UK Research Excellence Framework and the Matthew effect: Insights from machine learning," PLOS ONE, Public Library of Science, vol. 13(11), pages 1-13, November.
    11. Saarela, Mirka & Kärkkäinen, Tommi & Lahtonen, Tommi & Rossi, Tuomo, 2016. "Expert-based versus citation-based ranking of scholarly and scientific publication channels," Journal of Informetrics, Elsevier, vol. 10(3), pages 693-718.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. William E Savage & Anthony J Olejniczak, 2022. "More journal articles and fewer books: Publication practices in the social sciences in the 2010’s," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-16, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Thelwall, Mike & Kousha, Kayvan & Stuart, Emma & Makita, Meiko & Abdoli, Mahshid & Wilson, Paul & Levitt, Jonathan, 2023. "Do bibliometrics introduce gender, institutional or interdisciplinary biases into research evaluations?," Research Policy, Elsevier, vol. 52(8).
    2. Daniele Checchi & Alberto Ciolfi & Gianni De Fraja & Irene Mazzotta & Stefano Verzillo, 2021. "Have you Read This? An Empirical Comparison of the British REF Peer Review and the Italian VQR Bibliometric Algorithm," Economica, London School of Economics and Political Science, vol. 88(352), pages 1107-1129, October.
    3. Fernandez Martinez, Roberto & Lostado Lorza, Ruben & Santos Delgado, Ana Alexandra & Piedra, Nelson, 2021. "Use of classification trees and rule-based models to optimize the funding assignment to research projects: A case study of UTPL," Journal of Informetrics, Elsevier, vol. 15(1).
    4. Battiston, Pietro & Sacco, Pier Luigi & Stanca, Luca, 2022. "Cover effects on citations uncovered: Evidence from Nature," Journal of Informetrics, Elsevier, vol. 16(2).
    5. James Tooley & Barrie Craven, 2018. "Private Sector Alternatives to the Research Excellence Framework for University League Tables," Economic Affairs, Wiley Blackwell, vol. 38(3), pages 434-443, October.
    6. Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
    7. Torres-Salinas, Daniel & Valderrama-Baca, Pilar & Arroyo-Machado, Wenceslao, 2022. "Is there a need for a new journal metric? Correlations between JCR Impact Factor metrics and the Journal Citation Indicator—JCI," Journal of Informetrics, Elsevier, vol. 16(3).
    8. Milojević, Staša & Radicchi, Filippo & Bar-Ilan, Judit, 2017. "Citation success index − An intuitive pair-wise journal comparison metric," Journal of Informetrics, Elsevier, vol. 11(1), pages 223-231.
    9. Sánchez-Gil, Susana & Gorraiz, Juan & Melero-Fuentes, David, 2018. "Reference density trends in the major disciplines," Journal of Informetrics, Elsevier, vol. 12(1), pages 42-58.
    10. Petersen, Alexander M. & Pan, Raj K. & Pammolli, Fabio & Fortunato, Santo, 2019. "Methods to account for citation inflation in research evaluation," Research Policy, Elsevier, vol. 48(7), pages 1855-1865.
    11. Kulczycki, Emanuel & Korzeń, Marcin & Korytkowski, Przemysław, 2017. "Toward an excellence-based research funding system: Evidence from Poland," Journal of Informetrics, Elsevier, vol. 11(1), pages 282-298.
    12. John P A Ioannidis & Kevin Boyack & Paul F Wouters, 2016. "Citation Metrics: A Primer on How (Not) to Normalize," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-7, September.
    13. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    14. Živan Živković & Marija Panić, 2020. "Development of science and education in the Western Balkan countries: competitiveness with the EU," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(3), pages 2319-2339, September.
    15. Siler, Kyle & Larivière, Vincent, 2022. "Who games metrics and rankings? Institutional niches and journal impact factor inflation," Research Policy, Elsevier, vol. 51(10).
    16. Wumei Du & Zheng Xie & Yiqin Lv, 2021. "Predicting publication productivity for authors: Shallow or deep architecture?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5855-5879, July.
    17. Basso, Antonella & di Tollo, Giacomo, 2022. "Prediction of UK research excellence framework assessment by the departmental h-index," European Journal of Operational Research, Elsevier, vol. 296(3), pages 1036-1049.
    18. Banal-Estañol, Albert & Jofre-Bonet, Mireia & Iori, Giulia & Maynou, Laia & Tumminello, Michele & Vassallo, Pietro, 2023. "Performance-based research funding: Evidence from the largest natural experiment worldwide," Research Policy, Elsevier, vol. 52(6).
    19. Drivas, Kyriakos & Kremmydas, Dimitris, 2020. "The Matthew effect of a journal's ranking," Research Policy, Elsevier, vol. 49(4).
    20. Matthias Kuppler, 2022. "Predicting the future impact of Computer Science researchers: Is there a gender bias?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6695-6732, November.

    More about this item

    Keywords

    REF; Impact factor; Metrics; Funding;

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:126:y:2021:i:9:d:10.1007_s11192-021-04030-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.