Printed from https://ideas.repec.org/a/spr/scient/v129y2024i9d10.1007_s11192-024-05104-1.html

Opium in science and society: numbers and other quantifications

Author

Listed:
  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

  • Julian N. Marewski

    (University of Lausanne)

Abstract

In science and beyond, quantifications are omnipresent when it comes to justifying judgments. Which scientific author, hiring committee member, or advisory board panelist has not been confronted with page-long publication manuals, assessment reports, or evaluation guidelines calling for p-values, citation rates, h-indices, or other numbers to judge the ‘quality’ of findings, applicants, or institutions? Yet many of us who rely on and call for quantifications may not understand what information numbers can convey, and what they cannot. Focusing on the uninformed use of bibliometrics as a worrisome outgrowth of the increasing quantification of science, this opinion essay places the abuse of quantifications into historical contexts and trends: mistrust in human intuitive judgment, obsessions with control and accountability, and a bureaucratization of science. We call for bringing common sense back into scientific (bibliometrics-based) judgment exercises. Despite all the number crunching, many judgments, be they about empirical findings or research institutions, will neither be straightforward, clear, and unequivocal, nor can they be ‘validated’ or ‘objectified’ by external standards. We conclude that assessments in science ought to be understood, and made, as judgments under uncertainty.
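The abstract's recurring example, the h-index, can be made concrete with a minimal sketch (not taken from the article; the function name and sample citation counts are illustrative): an author has h-index h if h of their papers each have at least h citations.

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts.

    The h-index is the largest h such that at least h papers
    have at least h citations each.
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:  # paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

# Hypothetical citation records for illustration only:
print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

As the essay argues (and as the cited Waltman & van Eck piece on the inconsistency of the h-index details), the second author's single highly cited paper is invisible to the indicator, which is one reason the authors caution against uninformed reliance on such numbers.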

Suggested Citation

  • Lutz Bornmann & Julian N. Marewski, 2024. "Opium in science and society: numbers and other quantifications," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5313-5346, September.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:9:d:10.1007_s11192-024-05104-1
    DOI: 10.1007/s11192-024-05104-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05104-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05104-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sophie Biesenbender & Stefan Hornbostel, 2016. "The Research Core Dataset for the German science system: challenges, processes and principles of a contested standardization project," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 837-847, February.
    2. Herbert A. Simon, 1955. "A Behavioral Model of Rational Choice," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 69(1), pages 99-118.
    3. Charles F. Manski, 2013. "Response to the Review of ‘Public Policy in an Uncertain World’," Economic Journal, Royal Economic Society, vol. 0, pages 412-415, August.
    4. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    5. James G. March, 1991. "Exploration and Exploitation in Organizational Learning," Organization Science, INFORMS, vol. 2(1), pages 71-87, February.
    6. James J. Heckman & Sidharth Moktan, 2020. "Publishing and promotion in economics - The tyranny of the Top Five," Vox eBook Chapters, in: Sebastian Galliani & Ugo Panizza (ed.), Publishing and Measuring Success in Economics, edition 1, volume 1, chapter 1, pages 23-32, Centre for Economic Policy Research.
    7. Wolfgang Glänzel & Henk F. Moed, 2013. "Opinion paper: thoughts and facts on bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 381-394, July.
    8. Ludo Waltman & Nees Jan van Eck, 2012. "The inconsistency of the h-index," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 406-415, February.
    9. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    10. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    11. Henk F. Moed & Gali Halevi, 2015. "Multidimensional assessment of scholarly research impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 1988-2002, October.
    12. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    13. A. Bookstein, 1997. "Informetric distributions. III. Ambiguity and randomness," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 48(1), pages 2-10, January.
    14. Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Chr, 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.
      • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Bjorn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambe, 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
    15. Werner Marx & Lutz Bornmann, 2013. "The emergence of plate tectonics and the Kuhnian model of paradigm shift: a bibliometric case study based on the Anna Karenina principle," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(2), pages 595-614, February.
    16. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    17. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    18. Andrew Gelman & Christian Hennig, 2017. "Beyond subjective and objective in statistics," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 967-1033, October.
    19. Liv Langfeldt & Ingvild Reymert & Dag W Aksnes, 2021. "The role of metrics in peer assessments [How Incentives Trickle down: Local Use of a National Bibliometric Indicator System]," Research Evaluation, Oxford University Press, vol. 30(1), pages 112-126.
    20. John Antonakis & Samuel Bendahan & Philippe Jacquart & Rafael Lalive, 2010. "On making causal claims: A review and recommendations," Post-Print hal-02313119, HAL.
    21. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    22. Konstantinos V. Katsikopoulos & Julian N. Marewski & Ulrich Hoffrage, 2024. "Heuristics for metascience: Simon and Popper," Chapters, in: Gerd Gigerenzer & Shabnam Mousavi & Riccardo Viale (ed.), Elgar Companion to Herbert Simon, chapter 14, pages 300-311, Edward Elgar Publishing.
    23. Tahamtan, Iman & Bornmann, Lutz, 2018. "Core elements in the process of citing publications: Conceptual overview of the literature," Journal of Informetrics, Elsevier, vol. 12(1), pages 203-216.
    24. Werner Marx & Lutz Bornmann, 2015. "On the causes of subject-specific citation rates in Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1823-1827, February.
    25. Eugenie Samuel Reich, 2013. "Science publishing: The golden club," Nature, Nature, vol. 502(7471), pages 291-293, October.
    26. Anthony F. J. van Raan & Thed N. van Leeuwen & Martijn S. Visser, 2011. "Severe language effect in university rankings: particularly Germany and France are wronged in citation-based rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(2), pages 495-498, August.
    27. Charles F. Manski, 2011. "Choosing Treatment Policies Under Ambiguity," Annual Review of Economics, Annual Reviews, vol. 3(1), pages 25-49, September.
    28. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    29. Colin Macilwain, 2013. "Halt the avalanche of performance metrics," Nature, Nature, vol. 500(7462), pages 255-255, August.
    30. Daniel A. Levinthal & James G. March, 1993. "The myopia of learning," Strategic Management Journal, Wiley Blackwell, vol. 14(S2), pages 95-112, December.
    31. Herbert A. Simon & Allen Newell, 1958. "Heuristic Problem Solving: The Next Advance in Operations Research," Operations Research, INFORMS, vol. 6(1), pages 1-10, February.
    32. Shabnam Mousavi & Gerd Gigerenzer, 2017. "Heuristics are Tools for Uncertainty," Homo Oeconomicus: Journal of Behavioral and Institutional Economics, Springer, vol. 34(4), pages 361-379, December.
    33. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    34. Lutz Bornmann & Johann Bauer, 2015. "Which of the world's institutions employ the most highly cited researchers? An analysis of the data from highlycited.com," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 2146-2148, October.
    35. Duncan A Thomas & Maria Nedeva & Mayra M Tirado & Merle Jacob, 2020. "Changing research on research evaluation: A critical literature review to revisit the agenda," Research Evaluation, Oxford University Press, vol. 29(3), pages 275-288.
    36. Peter Weingart, 2005. "Impact of bibliometrics upon the science system: Inadvertent consequences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 62(1), pages 117-131, January.
    37. Lutz Bornmann, 2013. "Research Misconduct—Definitions, Manifestations and Extent," Publications, MDPI, vol. 1(3), pages 1-12, October.
    38. Ajiferuke, Isola & Famoye, Felix, 2015. "Modelling count response variables in informetric studies: Comparison among count, linear, and lognormal regression models," Journal of Informetrics, Elsevier, vol. 9(3), pages 499-513.
    39. Ewen Callaway, 2016. "Beat it, impact factor! Publishing elite turns against controversial metric," Nature, Nature, vol. 535(7611), pages 210-211, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    2. Bornmann, Lutz & Leydesdorff, Loet, 2017. "Skewness of citation impact data and covariates of citation distributions: A large-scale empirical analysis based on Web of Science data," Journal of Informetrics, Elsevier, vol. 11(1), pages 164-175.
    3. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    4. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    5. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    6. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    7. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    8. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    9. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    10. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    11. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    12. Lutz Bornmann & Loet Leydesdorff, 2018. "Count highly-cited papers instead of papers with h citations: use normalized citation counts and compare “like with like”!," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1119-1123, May.
    13. Schilling, Melissa A. & Green, Elad, 2011. "Recombinant search and breakthrough idea generation: An analysis of high impact papers in the social sciences," Research Policy, Elsevier, vol. 40(10), pages 1321-1331.
    14. Pedota, Mattia & Cicala, Francesco & Basti, Alessio, 2024. "A Wild Mind with a Disciplined Eye: Unleashing Human-GenAI Creativity Through Simulated Entity Elicitation," OSF Preprints 3bn95, Center for Open Science.
    15. Maziar Montazerian & Edgar Dutra Zanotto & Hellmut Eckert, 2019. "A new parameter for (normalized) evaluation of H-index: countries as a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1065-1078, March.
    16. Bart Leten & Rene Belderbos & Bart Van Looy, 2016. "Entry and Technological Performance in New Technology Domains: Technological Opportunities, Technology Competition and Technological Relatedness," Journal of Management Studies, Wiley Blackwell, vol. 53(8), pages 1257-1291, December.
    17. Yan Ling & Michelle Hammond & Li-Qun Wei, 2022. "Ethical leadership and ambidexterity in young firms: examining the CEO-TMT Interface," International Entrepreneurship and Management Journal, Springer, vol. 18(1), pages 25-48, March.
    18. Christina Fang & Daniel Levinthal, 2009. "Near-Term Liability of Exploitation: Exploration and Exploitation in Multistage Problems," Organization Science, INFORMS, vol. 20(3), pages 538-551, June.
    19. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    20. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:9:d:10.1007_s11192-024-05104-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.