Printed from https://ideas.repec.org/a/pal/palcom/v5y2019i1d10.1057_s41599-019-0233-x.html

Systematic analysis of agreement between metrics and peer review in the UK REF

Authors

  • V. A. Traag

    (Leiden University)

  • L. Waltman

    (Leiden University)

Abstract

When performing a national research assessment, some countries rely on citation metrics whereas others, such as the UK, primarily use peer review. In the influential Metric Tide report, a low agreement between metrics and peer review in the UK Research Excellence Framework (REF) was found. However, earlier studies observed much higher agreement between metrics and peer review in the REF and argued in favour of using metrics. This shows that there is considerable ambiguity in the discussion on agreement between metrics and peer review. We provide clarity in this discussion by considering four important points: (1) the level of aggregation of the analysis; (2) the use of either a size-dependent or a size-independent perspective; (3) the suitability of different measures of agreement; and (4) the uncertainty in peer review. In the context of the REF, we argue that agreement between metrics and peer review should be assessed at the institutional level rather than at the publication level. Both a size-dependent and a size-independent perspective are relevant in the REF. The interpretation of correlations may be problematic and as an alternative we therefore use measures of agreement that are based on the absolute or relative differences between metrics and peer review. To get an idea of the uncertainty in peer review, we rely on a model to bootstrap peer review outcomes. We conclude that particularly in Physics, Clinical Medicine, and Public Health, metrics agree relatively well with peer review and may offer an alternative to peer review.
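The abstract describes two methodological ingredients: measuring agreement via absolute differences between metric-based and peer-review-based institutional scores rather than correlations, and bootstrapping peer review outcomes to gauge their uncertainty. The sketch below illustrates both ideas on entirely hypothetical data (the institution names, scores, and sample sizes are invented for illustration; this is not the authors' actual model or the REF data):

```python
import random
import statistics

# Hypothetical data for illustration only: institution-level scores on the
# REF 0-4 quality scale, one from peer review and one from a citation metric.
peer = {"A": 3.1, "B": 2.7, "C": 3.5, "D": 2.2}
metric = {"A": 3.0, "B": 2.9, "C": 3.4, "D": 2.5}

# Agreement as the mean absolute difference (MAD) between the two scores,
# an alternative to correlation coefficients.
mad = statistics.mean(abs(peer[i] - metric[i]) for i in peer)
print(f"MAD between metrics and peer review: {mad:.3f}")

def bootstrap_means(scores, n_boot=1000, seed=0):
    """Resample per-output peer scores with replacement and return the sorted
    distribution of institutional means, a rough stand-in for the
    uncertainty in peer review."""
    rng = random.Random(seed)
    return sorted(statistics.mean(rng.choices(scores, k=len(scores)))
                  for _ in range(n_boot))

outputs_a = [4, 3, 3, 2, 4, 3, 3]  # hypothetical per-output peer scores
boot = bootstrap_means(outputs_a)
low, high = boot[25], boot[-25]  # approximate 95% bootstrap interval
print(f"Institution A: mean {statistics.mean(outputs_a):.2f}, "
      f"bootstrap 95% interval [{low:.2f}, {high:.2f}]")
```

A MAD near zero indicates close agreement; comparing it with the width of the bootstrap interval suggests whether metric-peer differences exceed the noise inherent in peer review itself.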

Suggested Citation

  • V. A. Traag & L. Waltman, 2019. "Systematic analysis of agreement between metrics and peer review in the UK REF," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
  • Handle: RePEc:pal:palcom:v:5:y:2019:i:1:d:10.1057_s41599-019-0233-x
    DOI: 10.1057/s41599-019-0233-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-019-0233-x
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-019-0233-x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bertocchi, Graziella & Gambardella, Alfonso & Jappelli, Tullio & Nappi, Carmela A. & Peracchi, Franco, 2015. "Bibliometric evaluation vs. informed peer review: Evidence from Italy," Research Policy, Elsevier, vol. 44(2), pages 451-466.
    2. Norris, Michael & Oppenheim, Charles, 2010. "Peer review and the h-index: Two studies," Journal of Informetrics, Elsevier, vol. 4(3), pages 221-232.
    3. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    4. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    5. Jevin West & Theodore Bergstrom & Carl T. Bergstrom, 2010. "Big Macs and Eigenfactor scores: Don't let correlation coefficients fool you," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(9), pages 1800-1807, September.
    6. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic is one year old: the Phoenix is ready to leave the nest," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1887-1894, September.
    7. Hicks, Diana, 2012. "Performance-based university research funding systems," Research Policy, Elsevier, vol. 41(2), pages 251-261.
    8. Jim Taylor, "undated". "Measuring Research Performance in Business Management Studies in the United Kingdom: the 1992 Research Assessment Exercise," Working Papers ec15/94, Department of Economics, University of Lancaster.
    9. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    10. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic: is the phoenix getting wings?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 371-383, January.
    11. Jill Johnes & Jim Taylor & Brian Francis, 1993. "The Research Performance of UK Universities: A Statistical Analysis of the Results of the 1989 Research Selectivity Exercise," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 156(2), pages 271-286, March.
    12. Linda Butler & Ian McAllister, 2009. "Metrics or Peer Review? Evaluating the 2001 UK Research Assessment Exercise in Political Science," Political Studies Review, Political Studies Association, vol. 7(1), pages 3-17, January.
    13. Jonathan Adams & Karen Gurney & Louise Jackson, 2008. "Calibrating the zoom — a test of Zitt’s hypothesis," Scientometrics, Springer;Akadémiai Kiadó, vol. 75(1), pages 81-95, April.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
    2. Thelwall, Mike & Kousha, Kayvan & Stuart, Emma & Makita, Meiko & Abdoli, Mahshid & Wilson, Paul & Levitt, Jonathan, 2023. "Do bibliometrics introduce gender, institutional or interdisciplinary biases into research evaluations?," Research Policy, Elsevier, vol. 52(8).
    3. Alonso Rodríguez-Navarro & Ricardo Brito, 2022. "The link between countries’ economic and scientific wealth has a complex dependence on technological activity and research policy," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2871-2896, May.
    4. Noémie Aubert Bonn & Wim Pinxten, 2021. "Advancing science or advancing careers? Researchers’ opinions on success indicators," PLOS ONE, Public Library of Science, vol. 16(2), pages 1-17, February.
    5. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    6. Alberto Baccini & Lucio Barabesi & Giuseppe De Nicolao, 2020. "On the agreement between bibliometrics and peer review: Evidence from the Italian research assessment exercises," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-28, November.
    7. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    8. Anthony F. J. van Raan, 2021. "Laudation on the occasion of the presentation of the Derek de Solla Price Award 2021 to Prof. Ludo Waltman at the ISSI conference, Leuven, 2021," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8235-8238, October.
    9. Brito, Ricardo & Navarro, Alonso Rodríguez, 2021. "The inconsistency of h-index: A mathematical analysis," Journal of Informetrics, Elsevier, vol. 15(1).
    10. Daniela De Filippo & Rafael Aleixandre-Benavent & Elías Sanz-Casado, 2020. "Toward a classification of Spanish scholarly journals in social sciences and humanities considering their impact and visibility," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1709-1732, November.
    11. Rosa Grimaldi & Martin Kenney & Andrea Piccaluga, 2021. "University technology transfer, regional specialization and local dynamics: lessons from Italy," The Journal of Technology Transfer, Springer, vol. 46(4), pages 855-865, August.
    12. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    13. Erich Battistin & Marco Ovidi, 2022. "Rising Stars: Expert Reviews and Reputational Yardsticks in the Research Excellence Framework," Economica, London School of Economics and Political Science, vol. 89(356), pages 830-848, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    3. Robin Haunschild & Sven E. Hug & Martin P. Brändle & Lutz Bornmann, 2018. "The number of linked references of publications in Microsoft Academic in comparison with the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 367-370, January.
    4. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    5. Bornmann, Lutz & Haunschild, Robin, 2022. "Empirical analysis of recent temporal dynamics of research fields: Annual publications in chemistry and related areas as an example," Journal of Informetrics, Elsevier, vol. 16(2).
    6. Ruiz-Castillo, Javier & Costas, Rodrigo, 2018. "Individual and field citation distributions in 29 broad scientific fields," Journal of Informetrics, Elsevier, vol. 12(3), pages 868-892.
    7. Ricardo Arencibia-Jorge & Rosa Lidia Vega-Almeida & José Luis Jiménez-Andrade & Humberto Carrillo-Calvet, 2022. "Evolutionary stages and multidisciplinary nature of artificial intelligence research," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5139-5158, September.
    8. Alfonso Ávila-Robinson & Cristian Mejia & Shintaro Sengoku, 2021. "Are bibliometric measures consistent with scientists’ perceptions? The case of interdisciplinarity in research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7477-7502, September.
    9. Ruiz-Castillo, Javier & Costas, Rodrigo, 2014. "The skewness of scientific productivity," Journal of Informetrics, Elsevier, vol. 8(4), pages 917-934.
    10. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    11. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    12. De Fraja, Gianni & Facchini, Giovanni & Gathergood, John, 2016. "How Much Is That Star in the Window? Professorial Salaries and Research Performance in UK Universities," CEPR Discussion Papers 11638, C.E.P.R. Discussion Papers.
    13. Abramo, Giovanni & D’Angelo, Ciriaco Andrea & Zhang, Lin, 2018. "A comparison of two approaches for measuring interdisciplinary research output: The disciplinary diversity of authors vs the disciplinary diversity of the reference list," Journal of Informetrics, Elsevier, vol. 12(4), pages 1182-1193.
    14. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "University citation distributions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2790-2804, November.
    15. Confraria, Hugo & Ciarli, Tommaso & Noyons, Ed, 2024. "Countries' research priorities in relation to the Sustainable Development Goals," Research Policy, Elsevier, vol. 53(3).
    16. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "A comparison of two ways of evaluating research units working in different scientific fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 539-561, February.
    17. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    18. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    19. Carusi, Chiara & Bianchi, Giuseppe, 2019. "Scientific community detection via bipartite scholar/journal graph co-clustering," Journal of Informetrics, Elsevier, vol. 13(1), pages 354-386.
    20. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:5:y:2019:i:1:d:10.1057_s41599-019-0233-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: https://www.nature.com/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.