
Enhancing the robustness of the disruption metric against noise

Authors

Listed:
  • Nan Deng (Beijing Normal University)
  • An Zeng (Beijing Normal University)

Abstract

Measuring the novelty of scientific papers is an important research topic. If most subsequent works that cite a focal paper cite only that paper rather than its references as well, the paper could be highly disruptive, as it may start a new stream of research. However, due to preferential attachment, even if a focal paper is very disruptive, subsequent works may still cite both the focal paper and some of its highly cited references. To eliminate the noise caused by these highly cited references, we modify the disruption metric and analyze its performance and robustness. The results show that the improved method can better distinguish Nobel-prize-winning papers from other papers. In addition, the resulting ranking is more stable with respect to highly cited references and to random link removal in the citation network.
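
The disruption metric referred to above is the index D = (n_i - n_j) / (n_i + n_j + n_k) of Funk and Owen-Smith (2017), popularized by Wu, Wang and Evans (2019): for a focal paper, n_i counts subsequent papers that cite the focal paper but none of its references, n_j those that cite both the focal paper and at least one of its references, and n_k those that cite at least one of the references but not the focal paper. The sketch below computes this index together with an illustrative noise-filtered variant that drops references above a citation-count threshold before counting; the dictionary-based inputs, the function names, and the threshold value are assumptions made for illustration and are not the authors' published code or their exact modification.

    # A minimal sketch of the disruption index and an illustrative
    # noise-filtered variant. Inputs, names, and the threshold are
    # assumptions for illustration, not the authors' published method.

    def disruption_index(focal, cites, cited_by, ignore_refs=frozenset()):
        """D = (n_i - n_j) / (n_i + n_j + n_k) for one focal paper.

        cites[p]    -> set of papers that p cites (its reference list)
        cited_by[p] -> set of papers that cite p
        ignore_refs -> references of the focal paper left out of the count
        """
        refs = set(cites.get(focal, ())) - set(ignore_refs)
        focal_citers = set(cited_by.get(focal, ()))

        # n_j: cite the focal paper and at least one retained reference
        n_j = sum(1 for p in focal_citers if set(cites.get(p, ())) & refs)
        # n_i: cite the focal paper but none of its retained references
        n_i = len(focal_citers) - n_j

        # n_k: cite at least one retained reference but not the focal paper
        ref_citers = set()
        for r in refs:
            ref_citers |= set(cited_by.get(r, ()))
        n_k = len(ref_citers - focal_citers - {focal})

        total = n_i + n_j + n_k
        return (n_i - n_j) / total if total else 0.0

    def filtered_disruption_index(focal, cites, cited_by, citation_threshold=100):
        """Illustrative variant: ignore references whose citation count exceeds
        a hypothetical threshold, so that links to very highly cited
        references do not inflate n_j."""
        noisy = {r for r in cites.get(focal, ())
                 if len(cited_by.get(r, ())) > citation_threshold}
        return disruption_index(focal, cites, cited_by, ignore_refs=noisy)

    # Toy example: paper "F" cites "A" and "B"; "X" cites only "F",
    # "Y" cites "F" and "A", "Z" cites only "A".
    # cites = {"F": {"A", "B"}, "X": {"F"}, "Y": {"F", "A"}, "Z": {"A"}}
    # cited_by = {"F": {"X", "Y"}, "A": {"F", "Y", "Z"}, "B": {"F"}}
    # disruption_index("F", cites, cited_by)  ->  (1 - 1) / (1 + 1 + 1) = 0.0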

Suggested Citation

  • Nan Deng & An Zeng, 2023. "Enhancing the robustness of the disruption metric against noise," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2419-2428, April.
  • Handle: RePEc:spr:scient:v:128:y:2023:i:4:d:10.1007_s11192-023-04644-2
    DOI: 10.1007/s11192-023-04644-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04644-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. John P. A. Ioannidis & Kevin W. Boyack & Henry Small & Aaron A. Sorensen & Richard Klavans, 2014. "Bibliometrics: Is your most cited work your best?," Nature, Nature, vol. 514(7524), pages 561-562, October.
    3. Lutz Bornmann & Sitaram Devarakonda & Alexander Tekles & George Chacko, 2020. "Disruptive papers published in Scientometrics: meaningful results by using an improved variant of the disruption index originally proposed by Wu, Wang, and Evans (2019)," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 1149-1155, May.
    4. Lindell Bromham & Russell Dinnage & Xia Hua, 2016. "Interdisciplinary research has consistently lower funding success," Nature, Nature, vol. 534(7609), pages 684-687, June.
    5. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    6. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    7. Lutz Bornmann & Alexander Tekles, 2019. "Disruptive papers published in Scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 331-336, July.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Zhang, Ming-Ze & Wang, Tang-Rong & Lyu, Peng-Hui & Chen, Qi-Mei & Li, Ze-Xia & Ngai, Eric W.T., 2024. "Impact of gender composition of academic teams on disruptive output," Journal of Informetrics, Elsevier, vol. 18(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Leydesdorff, Loet & Bornmann, Lutz, 2021. "Disruption indices and their calculation using web-of-science data: Indicators of historical developments or evolutionary dynamics?," Journal of Informetrics, Elsevier, vol. 15(4).
    2. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    3. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    4. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    5. Ruan, Xuanmin & Lyu, Dongqing & Gong, Kaile & Cheng, Ying & Li, Jiang, 2021. "Rethinking the disruption index as a measure of scientific and technological advances," Technological Forecasting and Social Change, Elsevier, vol. 172(C).
    6. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.
    7. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    8. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    9. Shuo Xu & Liyuan Hao & Xin An & Hongshen Pang & Ting Li, 2020. "Review on emerging research topics with key-route main path analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 607-624, January.
    10. K. Brad Wray & Søren R. Paludan & Lutz Bornmann & Robin Haunschild, 2024. "Using Reference Publication Year Spectroscopy (RPYS) to analyze the research and publication culture in immunology," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(6), pages 3271-3283, June.
    11. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    12. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    13. Lin, Yiling & Evans, James A. & Wu, Lingfei, 2022. "New directions in science emerge from disconnection and discord," Journal of Informetrics, Elsevier, vol. 16(1).
    14. Zhang, Ming-Ze & Wang, Tang-Rong & Lyu, Peng-Hui & Chen, Qi-Mei & Li, Ze-Xia & Ngai, Eric W.T., 2024. "Impact of gender composition of academic teams on disruptive output," Journal of Informetrics, Elsevier, vol. 18(2).
    15. Jeffrey T. Macher & Christian Rutzer & Rolf Weder, 2023. "The Illusive Slump of Disruptive Patents," Papers 2306.10774, arXiv.org.
    16. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    17. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    18. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    19. Ao, Weiyi & Lyu, Dongqing & Ruan, Xuanmin & Li, Jiang & Cheng, Ying, 2023. "Scientific creativity patterns in scholars’ academic careers: Evidence from PubMed," Journal of Informetrics, Elsevier, vol. 17(4).
    20. Lutz Bornmann & Sitaram Devarakonda & Alexander Tekles & George Chacko, 2020. "Disruptive papers published in Scientometrics: meaningful results by using an improved variant of the disruption index originally proposed by Wu, Wang, and Evans (2019)," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 1149-1155, May.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.