IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v127y2022i10d10.1007_s11192-022-04499-z.html

Revisiting the disruptive index: evidence from the Nobel Prize-winning articles

Authors

Listed:
  • Guoqiang Liang

    (Beijing University of Technology)

  • Ying Lou

    (Beijing University of Technology)

  • Haiyan Hou

    (Dalian University of Technology)

Abstract

In the last two decades, scholars have designed various bibliometric indicators to identify breakthrough-class academic achievements. In this study, we take a step further and examine the performance of the promising disruptive index (DI) proposed by Wu et al. (Nature 566(7744):378-382, https://doi.org/10.1038/s41586-019-0941-9, 2019), thereby deepening our understanding of the DI and facilitating its judicious use in bibliometrics. Using publication records for Nobel laureates between 1900 and 2016, we calculate the DI of Nobel Prize-winning articles and of benchmark articles from each year, use the median and mean DI to denote each year's central tendency, and analyze how the DI varies over time since publication. We find that Nobel Prize-winning articles are not necessarily more disruptive than benchmark articles. Results based on the DI depend on the length of the citation time window, and different citation time windows may produce different, even contradictory, results. Research assessment should therefore balance short- and long-term scientific impact. Discipline and time also influence the appropriate citation-window length when using the DI to measure the innovativeness of scientific work. The study further discusses potential research directions around the DI.
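The disruptive index discussed in the abstract (Funk & Owen-Smith, 2017; Wu et al., 2019) classifies the papers citing a focal work within a chosen citation window. A minimal sketch of the computation, with hypothetical paper identifiers, might look like this:

```python
def disruption_index(citing_focal: set, citing_refs: set) -> float:
    """DI = (n_i - n_j) / (n_i + n_j + n_k), where, within the window:
    n_i = papers citing the focal paper but none of its references,
    n_j = papers citing both the focal paper and its references,
    n_k = papers citing the references but not the focal paper.
    """
    n_i = len(citing_focal - citing_refs)
    n_j = len(citing_focal & citing_refs)
    n_k = len(citing_refs - citing_focal)
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Made-up example: p1-p3 cite only the focal paper, p4 cites both
# it and its references, p5-p6 cite only the references.
di = disruption_index({"p1", "p2", "p3", "p4"}, {"p4", "p5", "p6"})
# n_i = 3, n_j = 1, n_k = 2, so DI = (3 - 1) / 6 ≈ 0.33
```

The window dependence the paper highlights arises because both citing sets grow as the window lengthens, so the same article can score differently under short and long windows.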

Suggested Citation

  • Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:10:d:10.1007_s11192-022-04499-z
    DOI: 10.1007/s11192-022-04499-z
    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04499-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04499-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    3. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    4. Purkayastha, Amrita & Palmaro, Eleonora & Falk-Krzesinski, Holly J. & Baas, Jeroen, 2019. "Comparison of two article-level, field-independent citation metrics: Field-Weighted Citation Impact (FWCI) and Relative Citation Ratio (RCR)," Journal of Informetrics, Elsevier, vol. 13(2), pages 635-642.
    5. Guo, Xiaolong & Li, Xiaoxiao & Yu, Yugang, 2021. "Publication delay adjusted impact factor: The effect of publication delay of articles on journal impact factor," Journal of Informetrics, Elsevier, vol. 15(1).
    6. Xi Zhang & Xianhai Wang & Hongke Zhao & Patricia Ordóñez de Pablos & Yongqiang Sun & Hui Xiong, 2019. "An effectiveness analysis of altmetrics indices for different levels of artificial intelligence publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1311-1344, June.
    7. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    8. Liang, Guoqiang & Hou, Haiyan & Ding, Ying & Hu, Zhigang, 2020. "Knowledge recency to the birth of Nobel Prize-winning articles: Gender, career stage, and country," Journal of Informetrics, Elsevier, vol. 14(3).

    Citations

    Citations are extracted by the CitEc Project.
    Cited by:

    1. Zhang, Ming-Ze & Wang, Tang-Rong & Lyu, Peng-Hui & Chen, Qi-Mei & Li, Ze-Xia & Ngai, Eric W.T., 2024. "Impact of gender composition of academic teams on disruptive output," Journal of Informetrics, Elsevier, vol. 18(2).
    2. Adrian Furnham, 2023. "Peer nominations as scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(2), pages 1451-1458, February.
    3. Ao, Weiyi & Lyu, Dongqing & Ruan, Xuanmin & Li, Jiang & Cheng, Ying, 2023. "Scientific creativity patterns in scholars’ academic careers: Evidence from PubMed," Journal of Informetrics, Elsevier, vol. 17(4).
    4. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    2. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    3. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    4. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    5. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    6. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    7. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    8. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    9. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    10. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    11. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    12. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    13. Yulin Yu & Daniel M. Romero, 2024. "Does the Use of Unusual Combinations of Datasets Contribute to Greater Scientific Impact?," Papers 2402.05024, arXiv.org, revised Sep 2024.
    14. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    15. Zhongyi Wang & Keying Wang & Jiyue Liu & Jing Huang & Haihua Chen, 2022. "Measuring the innovation of method knowledge elements in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2803-2827, May.
    16. Ruan, Xuanmin & Lyu, Dongqing & Gong, Kaile & Cheng, Ying & Li, Jiang, 2021. "Rethinking the disruption index as a measure of scientific and technological advances," Technological Forecasting and Social Change, Elsevier, vol. 172(C).
    17. Wenjie Wei & Hongxu Liu & Zhuanlan Sun, 2022. "Cover papers of top journals are reliable source for emerging topics detection: a machine learning based prediction framework," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4315-4333, August.
    18. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    19. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    20. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.