IDEAS — https://ideas.repec.org/a/eee/infome/v5y2011i2p275-291.html

The first Italian research assessment exercise: A bibliometric perspective

Author

Listed:
  • Franceschet, Massimo
  • Costantini, Antonio

Abstract

In December 2003, seventeen years after the first UK research assessment exercise, Italy started its first-ever national research evaluation, with the aim of evaluating, by peer review, the excellence of the national research production. The evaluation involved 20 disciplinary areas, 102 research structures, 18,500 research products and 6661 peer reviewers (1465 from abroad); it had a direct cost of 3.55 million euros and spanned 18 months. The introduction of ratings based on ex post quality of output, rather than on ex ante compliance with parameters, is an important leap forward of the national research evaluation system toward meritocracy. From the bibliometric perspective, the national assessment offered the unprecedented opportunity to perform a large-scale comparison of peer review and bibliometric indicators for an important share of the Italian research production. The present investigation takes full advantage of this opportunity to test whether peer review judgements and (article and journal) bibliometric indicators are independent variables and, if they are not, to measure the sign and strength of the association. The outcomes allow us to advocate the use of bibliometric evaluation, suitably integrated with expert review, for the forthcoming national assessment exercises, with the goal of shifting from the assessment of research excellence to the evaluation of average research performance without a significant increase in expenses.
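The statistical question the abstract poses — are peer judgements and bibliometric indicators independent, and if not, what are the sign and strength of the association? — can be sketched with standard tools. The following is a minimal illustration, not the authors' actual method: a chi-square test of independence on a contingency table of peer rating versus citation quartile, plus a Spearman rank correlation. All data are invented for illustration.

```python
# Hypothetical sketch: test independence of peer ratings and a
# bibliometric indicator, then measure the association's sign/strength.
# The data below are synthetic, generated for illustration only.
import numpy as np
from scipy.stats import chi2_contingency, spearmanr

rng = np.random.default_rng(42)

# Invented ordinal peer ratings (1 = limited ... 4 = excellent) for 200 papers.
peer = rng.integers(1, 5, size=200)
# Invented citation counts, loosely coupled to the peer rating.
citations = rng.poisson(lam=3 * peer)

# Contingency table: peer rating (rows) vs. citation quartile (columns).
quartile = np.digitize(citations, np.quantile(citations, [0.25, 0.5, 0.75]))
table = np.zeros((4, 4), dtype=int)
for r, q in zip(peer, quartile):
    table[r - 1, q] += 1

# Chi-square test: a small p-value rejects independence.
chi2, p, dof, _ = chi2_contingency(table)
# Spearman rho: sign and strength of the (rank) association.
rho, p_rho = spearmanr(peer, citations)

print(f"chi-square = {chi2:.1f}, p = {p:.3g}")
print(f"Spearman rho = {rho:.2f}")
```

With synthetic data built to correlate, both tests agree: independence is rejected and rho is positive, which mirrors the kind of outcome the paper reports for the Italian exercise.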

Suggested Citation

  • Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
  • Handle: RePEc:eee:infome:v:5:y:2011:i:2:p:275-291
    DOI: 10.1016/j.joi.2010.12.002

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157710001008
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2010.12.002?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Linda Butler & Ian McAllister, 2009. "Metrics or Peer Review? Evaluating the 2001 UK Research Assessment Exercise in Political Science," Political Studies Review, Political Studies Association, vol. 7(1), pages 3-17, January.
    2. E. Garfield & I. H. Sher, 1963. "New factors in the evaluation of scientific literature through citation indexing," American Documentation, Wiley Blackwell, vol. 14(3), pages 195-201, July.
    3. Michael H. MacRoberts & Barbara R. MacRoberts, 1989. "Problems of citation analysis: A critical review," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 40(5), pages 342-349, September.
    4. Benjamin M. Althouse & Jevin D. West & Carl T. Bergstrom & Theodore Bergstrom, 2009. "Differences in impact factor across fields and over time," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(1), pages 27-34, January.
    5. Michael J Stringer & Marta Sales-Pardo & Luís A Nunes Amaral, 2008. "Effectiveness of Journal Ranking Schemes as a Tool for Locating Information," PLOS ONE, Public Library of Science, vol. 3(2), pages 1-8, February.
    6. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    7. Massimo Franceschet, 2010. "A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(1), pages 243-258, April.
    8. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    9. Lutz Bornmann & Hans‐Dieter Daniel, 2007. "What do we know about the h index?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(9), pages 1381-1385, July.
    10. Franceschet, Massimo, 2010. "The difference between popularity and prestige in the sciences and in the social sciences: A bibliometric analysis," Journal of Informetrics, Elsevier, vol. 4(1), pages 55-63.
    11. Leo Egghe & Ronald Rousseau, 2006. "An informetric model for the Hirsch-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 121-129, October.
    12. Eliana Minelli & Gianfranco Rebora & Matteo Turri, 2008. "The Structure and Significance of the Italian Research Assessment Exercise (VTR)," Chapters, in: Carmelo Mazza & Paolo Quattrone & Angelo Riccaboni (ed.), European Universities in Transition, chapter 12, Edward Elgar Publishing.
    13. Anthony F. J. van Raan, 2004. "Sleeping Beauties in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 59(3), pages 467-472, March.
    14. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    15. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Caprasecca, Alessandro, 2009. "Allocative efficiency in public research funding: Can bibliometrics help?," Research Policy, Elsevier, vol. 38(1), pages 206-215, February.
    16. Franceschet, Massimo & Costantini, Antonio, 2010. "The effect of scholar collaboration on impact and quality of academic papers," Journal of Informetrics, Elsevier, vol. 4(4), pages 540-553.
    17. Philip Ball, 2007. "Achievement index climbs the ranks," Nature, Nature, vol. 448(7155), pages 737-737, August.
    18. Blaise Cronin, 2001. "Hyperauthorship: A postmodern perversion or evidence of a structural shift in scholarly communication practices?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 52(7), pages 558-569.
    19. Anthony F. J. van Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    20. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    21. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    22. Dag W Aksnes & Randi Elisabeth Taxt, 2004. "Peer reviews and bibliometric indicators: a comparative study at a Norwegian university," Research Evaluation, Oxford University Press, vol. 13(1), pages 33-41, April.
    23. Emanuela Reale & Anna Barbara & Antonio Costantini, 2007. "Peer review for the evaluation of academic research: lessons from the Italian experience," Research Evaluation, Oxford University Press, vol. 16(3), pages 216-228, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    2. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    3. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    4. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    5. Li, Jiang & Sanderson, Mark & Willett, Peter & Norris, Michael & Oppenheim, Charles, 2010. "Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments," Journal of Informetrics, Elsevier, vol. 4(4), pages 554-563.
    6. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    7. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    8. Fedderke, J.W. & Goldschmidt, M., 2015. "Does massive funding support of researchers work?: Evaluating the impact of the South African research chair funding initiative," Research Policy, Elsevier, vol. 44(2), pages 467-482.
    9. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    10. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    11. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    12. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    13. Filippo Radicchi & Claudio Castellano, 2013. "Analysis of bibliometric indicators for individual scholars in a large data set," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 627-637, December.
    14. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    15. Bornmann, Lutz & Marx, Werner, 2012. "HistCite analysis of papers constituting the h index research front," Journal of Informetrics, Elsevier, vol. 6(2), pages 285-288.
    16. Petridis, Konstantinos & Malesios, Chrisovalantis & Arabatzis, Garyfallos & Thanassoulis, Emmanuel, 2013. "Efficiency analysis of forestry journals: Suggestions for improving journals’ quality," Journal of Informetrics, Elsevier, vol. 7(2), pages 505-521.
    17. Anna Tietze & Philip Hofmann, 2019. "The h-index and multi-author hm-index for individual researchers in condensed matter physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 171-185, April.
    18. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    19. Leo Egghe & Ronald Rousseau, 2021. "The h-index formalism," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 6137-6145, July.
    20. Giovanni Abramo & Ciriaco Andrea D'Angelo, 2015. "The VQR, Italy's second national research assessment: Methodological failures and ranking distortions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2202-2214, November.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.