Printed from https://ideas.repec.org/a/eee/infome/v15y2021i3s1751157721000304.html

Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts

Author

Listed:
  • Bornmann, Lutz
  • Tekles, Alexander

Abstract

This study focuses on a recently introduced type of indicator for measuring disruptiveness in science. Disruptive research diverges from current lines of research by opening up new ones. In the current study, we included the initially proposed indicator of this new type, DI1 (Funk & Owen-Smith, 2017; Wu, Wang, & Evans, 2019), and several variants of it: DI5, DI1n, DI5n, and DEP. Since indicators should measure what they purport to measure, we investigated the convergent validity of the indicators. We used a list of milestone papers, selected and published by editors of Physical Review Letters, and investigated whether this expert-based list is related to the values of the several disruption indicator variants and, if so, which variants show the highest correlation with the expert judgements. We used bivariate statistics, multiple regression models, and coarsened exact matching (CEM) to investigate the convergent validity of the indicators. The results show that the indicators correlate differently with the editors' milestone paper assignments. It was not the initially proposed disruption index (DI1) that performed best, but the variant DI5, which was introduced by Bornmann, Devarakonda, Tekles, and Chacko (2020a). In the CEM analysis of this study, the DEP variant, introduced by Bu, Waltman, and Huang (in press), also showed favorable results.
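The disruption index discussed in the abstract compares, among the papers citing a focal paper or its references, those that cite only the focal paper with those that cite the focal paper together with its references. A minimal sketch of DI1 in this spirit is given below, on toy data; all paper IDs and the function itself are illustrative, not the authors' implementation. Raising the sharing threshold approximates DI5-style variants, which require a citing paper to cite several of the focal paper's references before it counts as citing "both".

```python
def disruption(focal, focal_refs, citers, threshold=1):
    """Sketch of the disruption index DI1 (Funk & Owen-Smith, 2017;
    Wu, Wang, & Evans, 2019).

    focal      -- ID of the focal paper
    focal_refs -- set of papers the focal paper cites
    citers     -- dict mapping each citing paper to the set of papers it
                  cites; only papers citing the focal paper or at least
                  one of its references should be included
    threshold  -- minimum number of shared focal references for a citing
                  paper to count as citing "both" (a higher threshold
                  approximates the DI5-style variants)
    """
    n_i = n_j = n_k = 0
    for cited in citers.values():
        cites_focal = focal in cited
        shared = len(cited & focal_refs)   # focal references also cited
        if cites_focal and shared >= threshold:
            n_j += 1                       # cites focal AND its references
        elif cites_focal:
            n_i += 1                       # cites focal only: disruptive
        elif shared:
            n_k += 1                       # cites references only
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Toy citation data (hypothetical paper IDs)
citers = {
    "A": {"F"},        # cites the focal paper only
    "B": {"F", "X"},   # cites the focal paper and an unrelated paper
    "C": {"F", "R1"},  # cites the focal paper and one of its references
    "D": {"R2"},       # cites a reference of the focal paper only
}
score = disruption("F", {"R1", "R2"}, citers)               # -> 0.25
score5 = disruption("F", {"R1", "R2"}, citers, threshold=2) # -> 0.75
```

Values near +1 indicate that citing papers ignore the focal paper's intellectual ancestors (disruption); values near -1 indicate that the focal paper is cited together with its references (consolidation).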

Suggested Citation

  • Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
  • Handle: RePEc:eee:infome:v:15:y:2021:i:3:s1751157721000304
    DOI: 10.1016/j.joi.2021.101159

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157721000304
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2021.101159?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. H. P. F. Peters & A. F. J. van Raan, 1994. "On determinants of citation scores: A case study in chemical engineering," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 45(1), pages 39-49, January.
    3. Lutz Bornmann & Hans-Dieter Daniel, 2009. "Reviewer and editor biases in journal peer review: an investigation of manuscript refereeing at Angewandte Chemie International Edition," Research Evaluation, Oxford University Press, vol. 18(4), pages 262-272, October.
    4. Onodera, Natsuo, 2016. "Properties of an index of citation durability of an article," Journal of Informetrics, Elsevier, vol. 10(4), pages 981-1004.
    5. Leydesdorff, Loet & Wagner, Caroline S. & Bornmann, Lutz, 2014. "The European Union, China, and the United States in the top-1% and top-10% layers of most-frequently cited publications: Competition and collaborations," Journal of Informetrics, Elsevier, vol. 8(3), pages 606-617.
    6. Lutz Bornmann & Sitaram Devarakonda & Alexander Tekles & George Chacko, 2020. "Disruptive papers published in Scientometrics: meaningful results by using an improved variant of the disruption index originally proposed by Wu, Wang, and Evans (2019)," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 1149-1155, May.
    7. Fereshteh Didegah & Mike Thelwall, 2013. "Determinants of research citation impact in nanoscience and nanotechnology," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(5), pages 1055-1064, May.
    8. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    9. Mariani, Manuel Sebastian & Medo, Matúš & Zhang, Yi-Cheng, 2016. "Identification of milestone papers through time-balanced network centrality," Journal of Informetrics, Elsevier, vol. 10(4), pages 1207-1223.
    10. Yves Gingras & Mahdi Khelfaoui, 2018. "Assessing the effect of the United States’ “citation advantage” on other countries’ scientific impact as measured in the Web of Science (WoS) database," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 517-532, February.
    11. Marco Caliendo & Sabine Kopeinig, 2008. "Some Practical Guidance For The Implementation Of Propensity Score Matching," Journal of Economic Surveys, Wiley Blackwell, vol. 22(1), pages 31-72, February.
    12. Wagner, Caroline S. & Whetsell, Travis A. & Mukherjee, Satyam, 2019. "International research collaboration: Novelty, conventionality, and atypicality in knowledge recombination," Research Policy, Elsevier, vol. 48(5), pages 1260-1270.
    13. Ahlgren, Per & Waltman, Ludo, 2014. "The correlation between citation-based and expert-based assessments of publication channels: SNIP and SJR vs. Norwegian quality assessments," Journal of Informetrics, Elsevier, vol. 8(4), pages 985-996.
    14. Fok, Dennis & Franses, Philip Hans, 2007. "Modeling the diffusion of scientific publications," Journal of Econometrics, Elsevier, vol. 139(2), pages 376-390, August.
    15. Michael N. Mitchell, 2012. "Interpreting and Visualizing Regression Models Using Stata," Stata Press books, StataCorp LP, number ivrm, March.
    16. Schilling, Melissa A. & Green, Elad, 2011. "Recombinant search and breakthrough idea generation: An analysis of high impact papers in the social sciences," Research Policy, Elsevier, vol. 40(10), pages 1321-1331.
    17. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    18. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    19. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    20. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    21. Elizabeth S. Vieira & José A.N.F. Gomes, 2016. "The bibliometric indicators as predictors of the final decision of the peer review," Research Evaluation, Oxford University Press, vol. 25(2), pages 170-183.
    22. Ben Jann, 2017. "Why propensity scores should be used for matching," German Stata Users' Group Meetings 2017 01, Stata Users Group.
    24. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    25. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    26. Matthew E Falagas & Angeliki Zarkali & Drosos E Karageorgopoulos & Vangelis Bardakas & Michael N Mavros, 2013. "The Impact of Article Length on the Number of Future Citations: A Bibliometric Analysis of General Medicine Journals," PLOS ONE, Public Library of Science, vol. 8(2), pages 1-8, February.
    27. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    28. Donald deB. Beaver, 2004. "Does collaborative research have greater epistemic authority?," Scientometrics, Springer;Akadémiai Kiadó, vol. 60(3), pages 399-408, August.
    29. Breusch, T S & Pagan, A R, 1979. "A Simple Test for Heteroscedasticity and Random Coefficient Variation," Econometrica, Econometric Society, vol. 47(5), pages 1287-1294, September.
    30. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    31. Lutz Bornmann & Alexander Tekles, 2019. "Disruptive papers published in Scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 331-336, July.
    32. Tahamtan, Iman & Bornmann, Lutz, 2018. "Core elements in the process of citing publications: Conceptual overview of the literature," Journal of Informetrics, Elsevier, vol. 12(1), pages 203-216.
    33. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    34. Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
    35. Iacus, Stefano M. & King, Gary & Porro, Giuseppe, 2012. "Causal Inference without Balance Checking: Coarsened Exact Matching," Political Analysis, Cambridge University Press, vol. 20(1), pages 1-24, January.
    36. Iman Tahamtan & Askar Safipour Afshar & Khadijeh Ahamdzadeh, 2016. "Factors affecting number of citations: a comprehensive review of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1195-1225, June.
    37. Maarten Wesel & Sally Wyatt & Jeroen Haaf, 2014. "What a difference a colon makes: how superficial factors influence subsequent citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1601-1615, March.
    38. Haddawy, Peter & Hassan, Saeed-Ul & Asghar, Awais & Amin, Sarah, 2016. "A comprehensive examination of the relation of three citation-based journal metrics to expert judgment of journal quality," Journal of Informetrics, Elsevier, vol. 10(1), pages 162-173.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    2. Tong, Tong & Wang, Wanru & Ye, Fred Y., 2024. "A complement to the novel disruption indicator based on knowledge entities," Journal of Informetrics, Elsevier, vol. 18(2).
    3. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    4. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    5. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    6. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    7. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    8. Leydesdorff, Loet & Bornmann, Lutz, 2021. "Disruption indices and their calculation using web-of-science data: Indicators of historical developments or evolutionary dynamics?," Journal of Informetrics, Elsevier, vol. 15(4).
    9. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    2. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    3. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    4. Bornmann, Lutz, 2019. "Does the normalized citation impact of universities profit from certain properties of their published documents – such as the number of authors and the impact factor of the publishing journals? A mult," Journal of Informetrics, Elsevier, vol. 13(1), pages 170-184.
    5. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.
    6. Lutz Bornmann & Adam Y. Ye & Fred Y. Ye, 2018. "Identifying “hot papers” and papers with “delayed recognition” in large-scale datasets by using dynamically normalized citation impact scores," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 655-674, August.
    7. Bornmann, Lutz & Leydesdorff, Loet, 2017. "Skewness of citation impact data and covariates of citation distributions: A large-scale empirical analysis based on Web of Science data," Journal of Informetrics, Elsevier, vol. 11(1), pages 164-175.
    8. Martorell Cunil, Onofre & Otero González, Luis & Durán Santomil, Pablo & Mulet Forteza, Carlos, 2023. "How to accomplish a highly cited paper in the tourism, leisure and hospitality field," Journal of Business Research, Elsevier, vol. 157(C).
    9. Fan, Lingxu & Guo, Lei & Wang, Xinhua & Xu, Liancheng & Liu, Fangai, 2022. "Does the author’s collaboration mode lead to papers’ different citation impacts? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 16(4).
    10. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    11. Mingyang Wang & Zhenyu Wang & Guangsheng Chen, 2019. "Which can better predict the future success of articles? Bibliometric indices or alternative metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1575-1595, June.
    12. Leydesdorff, Loet & Bornmann, Lutz, 2021. "Disruption indices and their calculation using web-of-science data: Indicators of historical developments or evolutionary dynamics?," Journal of Informetrics, Elsevier, vol. 15(4).
    13. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    14. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    15. Ruan, Xuanmin & Lyu, Dongqing & Gong, Kaile & Cheng, Ying & Li, Jiang, 2021. "Rethinking the disruption index as a measure of scientific and technological advances," Technological Forecasting and Social Change, Elsevier, vol. 172(C).
    16. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    17. Elizabeth S. Vieira, 2023. "The influence of research collaboration on citation impact: the countries in the European Innovation Scoreboard," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3555-3579, June.
    18. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    19. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    20. Yan Yan & Shanwu Tian & Jingjing Zhang, 2020. "The impact of a paper’s new combinations and new components on its citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 895-913, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:15:y:2021:i:3:s1751157721000304. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.