
Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers

Author

  • Eugenio Petrovich (University of Siena)

Abstract

Scholars in science and technology studies and bibliometricians are increasingly revealing the performative nature of bibliometric indicators. Far from being neutral technical measures, indicators such as the Impact Factor and the h-index are deeply transforming the social and epistemic structures of contemporary science. At the same time, scholars have highlighted how bibliometric indicators are endowed with social meanings that go beyond their purely technical definitions. These social representations of bibliometric indicators are constructed and negotiated among different groups of actors within several arenas. This study investigates how bibliometric indicators are used in a context that researchers have not yet covered: that of daily newspapers. Through a content analysis of a corpus of 583 articles that appeared in four major Italian newspapers between 1990 and 2020, we chronicle the main functions that bibliometrics and bibliometric indicators have played in the Italian press. Our material shows, among other things, that the public discourse developed in newspapers creates a favorable environment for bibliometrics-centered science policies; that bibliometric indicators contribute to the social construction of scientific facts in the press, especially in science news related to medicine; and that professional bibliometric expertise struggles to be represented in newspapers and hence to reach the general public.
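For readers unfamiliar with the indicators named in the abstract, the sketch below shows how the h-index is computed from an author's citation counts: a scholar has index h when h of their papers have received at least h citations each. It is a minimal Python illustration of this standard definition, with invented sample data; it is not part of the article's method.

    # Minimal sketch of the standard h-index definition (invented data).
    def h_index(citation_counts):
        # Largest h such that at least h papers have at least h citations each.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, citations in enumerate(counts, start=1):
            if citations >= rank:
                h = rank
            else:
                break
        return h

    # Invented example: six papers with these citation counts give an h-index of 3.
    print(h_index([10, 6, 3, 2, 1, 0]))  # prints 3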

Suggested Citation

  • Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:5:d:10.1007_s11192-022-04341-6
    DOI: 10.1007/s11192-022-04341-6
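    As an aside on the identifiers above, the RePEc handle is a colon-separated string that packs the archive code (spr), series code (scient), volume, year, issue, and DOI into a single identifier, with the slash of the DOI appearing as an underscore. The snippet below is an illustrative parse of this particular handle, not an official RePEc tool.

        handle = "RePEc:spr:scient:v:127:y:2022:i:5:d:10.1007_s11192-022-04341-6"

        prefix, archive, series, *tail = handle.split(":")
        # The tail alternates keys and values: v:127, y:2022, i:5, d:<doi>.
        fields = dict(zip(tail[0::2], tail[1::2]))
        doi = fields["d"].replace("_", "/", 1)

        print(archive, series, fields["v"], fields["y"], fields["i"], doi)
        # spr scient 127 2022 5 10.1007/s11192-022-04341-6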

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04341-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04341-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can access this item through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Evelyn Eika & Frode Eika Sandnes, 2022. "Starstruck by journal prestige and citation counts? On students’ bias and perceptions of trustworthiness according to clues in publication references," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6363-6390, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a schematic sketch of this matching idea follows the list below.
    1. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    2. Anne K. Krüger, 2020. "Quantification 2.0? Bibliometric Infrastructures in Academic Evaluation," Politics and Governance, Cogitatio Press, vol. 8(2), pages 58-67.
    3. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    4. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    5. Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.
    6. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    7. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    8. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    9. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    10. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    11. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    12. Gregorio González-Alcaide, 2021. "Bibliometric studies outside the information science and library science field: uncontainable or uncontrollable?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6837-6870, August.
    13. Pantea Kamrani & Isabelle Dorsch & Wolfgang G. Stock, 2021. "Do researchers know what the h-index is? And how do they estimate its importance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5489-5508, July.
    14. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    15. Yurij L. Katchanov & Yulia V. Markova, 2017. "The “space of physics journals”: topological structure and the Journal Impact Factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 313-333, October.
    16. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    17. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    18. Lutz Bornmann & Julian N. Marewski, 2024. "Opium in science and society: numbers and other quantifications," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5313-5346, September.
    19. Frank J. Rijnsoever & Laurens K. Hessels, 2021. "How academic researchers select collaborative research projects: a choice experiment," The Journal of Technology Transfer, Springer, vol. 46(6), pages 1917-1948, December.
    20. Elías Sanz-Casado & Daniela Filippo & Rafael Aleixandre Benavent & Vidar Røeggen & Janne Pölönen, 2021. "Impact and visibility of Norwegian, Finnish and Spanish journals in the fields of humanities," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 9031-9049, November.
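
    The matching described above combines two standard bibliometric notions: bibliographic coupling (two items cite the same works) and co-citation (two items are cited by the same works). The sketch below schematically illustrates that idea with invented data; it is not CitEc's or IDEAS' actual implementation.

        # Rank candidate items by shared references (coupling) plus shared citers (co-citation).
        def relatedness(focal_refs, focal_citers, other_refs, other_citers):
            shared_refs = len(set(focal_refs) & set(other_refs))
            shared_citers = len(set(focal_citers) & set(other_citers))
            return shared_refs + shared_citers

        # Invented reference lists and citer lists.
        focal = (["r1", "r2", "r4"], ["c1", "c2"])
        candidates = {
            "item_A": (["r1", "r2", "r3"], ["c1"]),
            "item_B": (["r9"], ["c1", "c2"]),
        }

        ranked = sorted(candidates, key=lambda k: relatedness(*focal, *candidates[k]), reverse=True)
        print(ranked)  # ['item_A', 'item_B']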

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:5:d:10.1007_s11192-022-04341-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.