IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v129y2024i11d10.1007_s11192-024-05150-9.html

An integrated indicator for evaluating scientific papers: considering academic impact and novelty

Author

Listed:
  • Zhaoping Yan

    (Nanjing University)

  • Kaiyu Fan

    (The Chinese University of Hong Kong - Shenzhen)

Abstract

The assessment of scientific papers has long been a challenging issue. Although numerous studies have proposed quantitative indicators for assessing scientific papers, they have overlooked the citation characteristics and the novelty of scientific knowledge implied in the textual information of papers. Therefore, this paper constructs an integrated indicator that evaluates scientific papers from both citation and semantic perspectives. First, we propose weighted citations to measure the academic impact of scientific papers, taking time heterogeneity and citation sentiment into consideration. Second, we capture the novelty of scientific papers from a semantic perspective, using FastText to represent papers as text embeddings and applying the local outlier factor to score each paper's semantic distinctiveness. To validate the performance of our approach, the bullwhip effect domain and the ACL Anthology corpus are used for case studies. The results demonstrate that our indicator can effectively identify outstanding papers, thus providing a more comprehensive method for evaluating academic research.
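The novelty component described in the abstract (FastText embeddings scored with the local outlier factor) can be sketched as follows. This is a minimal illustration, not the authors' implementation: random vectors stand in for FastText embeddings of paper text, and scikit-learn's LocalOutlierFactor supplies the LOF scores. The dimensions, neighbour count, and the "novel paper" placed far from the cluster are all illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(42)

# 100 "papers" whose (stand-in) embeddings sit near the field's semantic centre...
mainstream = rng.normal(loc=0.0, scale=0.1, size=(100, 50))
# ...and one semantically distant paper, acting as the novel outlier.
novel = np.full((1, 50), 2.0)
embeddings = np.vstack([mainstream, novel])

# LOF compares each paper's local density to that of its neighbours;
# a paper in a sparse semantic region gets a strongly negative score.
lof = LocalOutlierFactor(n_neighbors=20)
lof.fit_predict(embeddings)

# Negate so that larger values mean "more novel" (further from the mainstream).
novelty = -lof.negative_outlier_factor_
most_novel = int(np.argmax(novelty))
print(most_novel)  # index of the semantically most distant paper
```

In the paper's actual pipeline, the embedding step would use FastText trained on the corpus text rather than random vectors, but the outlier-scoring logic is the same: papers whose embeddings lie in low-density regions of the semantic space receive high novelty scores.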

Suggested Citation

  • Zhaoping Yan & Kaiyu Fan, 2024. "An integrated indicator for evaluating scientific papers: considering academic impact and novelty," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6909-6929, November.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:11:d:10.1007_s11192-024-05150-9
    DOI: 10.1007/s11192-024-05150-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05150-9
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05150-9?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Dejian Yu & Zhaoping Yan, 2022. "Combining machine learning and main path analysis to identify research front: from the perspective of science-technology linkage," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(7), pages 4251-4274, July.
    2. Stegehuis, Clara & Litvak, Nelly & Waltman, Ludo, 2015. "Predicting the long-term citation impact of recent publications," Journal of Informetrics, Elsevier, vol. 9(3), pages 642-657.
    3. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    4. Ruan, Xuanmin & Zhu, Yuanyang & Li, Jiang & Cheng, Ying, 2020. "Predicting the citation counts of individual papers via a BP neural network," Journal of Informetrics, Elsevier, vol. 14(3).
    5. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    6. Tove Faber Frandsen & Jeppe Nicolaisen, 2013. "The ripple effect: Citation chain reactions of a nobel prize," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(3), pages 437-447, March.
    7. Yu, Dejian & Yan, Zhaoping, 2023. "Main path analysis considering citation structure and content: Case studies in different domains," Journal of Informetrics, Elsevier, vol. 17(1).
    8. Purkayastha, Amrita & Palmaro, Eleonora & Falk-Krzesinski, Holly J. & Baas, Jeroen, 2019. "Comparison of two article-level, field-independent citation metrics: Field-Weighted Citation Impact (FWCI) and Relative Citation Ratio (RCR)," Journal of Informetrics, Elsevier, vol. 13(2), pages 635-642.
    9. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    10. Fuli Zhang, 2017. "Evaluating journal impact based on weighted citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1155-1169, November.
    11. Persson, Olle, 2010. "Identifying research themes with weighted direct citation links," Journal of Informetrics, Elsevier, vol. 4(3), pages 415-422.
    12. Veugelers, Reinhilde & Wang, Jian, 2019. "Scientific novelty and technological impact," Research Policy, Elsevier, vol. 48(6), pages 1362-1372.
    13. Claire Donovan, 2007. "Introduction: Future pathways for science policy and research assessment: Metrics vs peer review, quality vs impact," Science and Public Policy, Oxford University Press, vol. 34(8), pages 538-542, October.
    14. Xiaodan Zhu & Peter Turney & Daniel Lemire & André Vellino, 2015. "Measuring academic influence: Not all citations are equal," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(2), pages 408-427, February.
    15. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    16. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    17. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    18. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    19. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    20. Henk F. Moed & Gali Halevi, 2015. "Multidimensional assessment of scholarly research impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 1988-2002, October.
    21. Rudolf Farys & Tobias Wolbring, 2017. "Matched control groups for modeling events in citation data: An illustration of nobel prize effects in citation networks," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2201-2210, September.
    22. Yuanyuan Liu & Qiang Wu & Shijie Wu & Yong Gao, 2021. "Weighted citation based on ranking-related contribution: a new index for evaluating article impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8653-8672, October.
    23. John S. Liu & Louis Y. Y. Lu & Mei Hsiu-Ching Ho, 2019. "A few notes on main path analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 379-391, April.
    24. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    25. Yu, Dejian & Pan, Tianxing, 2021. "Tracing the main path of interdisciplinary research considering citation preference: A case from blockchain domain," Journal of Informetrics, Elsevier, vol. 15(2).
    26. Nassiri, Isar & Masoudi-Nejad, Ali & Jalili, Mahdi & Moeini, Ali, 2013. "Normalized Similarity Index: An adjusted index to prioritize article citations," Journal of Informetrics, Elsevier, vol. 7(1), pages 91-98.
    27. Dunaiski, Marcel & Visser, Willem & Geldenhuys, Jaco, 2016. "Evaluating paper and author ranking algorithms using impact and contribution awards," Journal of Informetrics, Elsevier, vol. 10(2), pages 392-407.
    28. Chai, Sen & Menon, Anoop, 2019. "Breakthrough recognition: Bias against novelty and competition for attention," Research Policy, Elsevier, vol. 48(3), pages 733-747.
    29. Monachary Kammari & Durga Bhavani S, 2023. "Time-stamp based network evolution model for citation networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3723-3741, June.
    30. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    31. Chan, Ho Fai & Frey, Bruno S. & Gallus, Jana & Torgler, Benno, 2014. "Academic honors and performance," Labour Economics, Elsevier, vol. 31(C), pages 188-204.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    2. Wang, Xiaoli & Liang, Wenting & Ye, Xuanting & Chen, Lingdi & Liu, Yun, 2024. "Disruptive development path measurement for emerging technologies based on the patent citation network," Journal of Informetrics, Elsevier, vol. 18(1).
    3. Xin Liu & Yi Bu & Ming Li & Jiang Li, 2024. "Monodisciplinary collaboration disrupts science more than multidisciplinary collaboration," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 75(1), pages 59-78, January.
    4. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    5. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    6. Ziyan Zhang & Junyan Zhang & Pushi Wang, 2024. "Measurement of disruptive innovation and its validity based on improved disruption index," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6477-6531, November.
    7. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    8. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    9. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    10. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    11. Zhang, Fang & Wu, Shengli, 2020. "Predicting future influence of papers, researchers, and venues in a dynamic academic network," Journal of Informetrics, Elsevier, vol. 14(2).
    12. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    13. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    14. Yu, Dejian & Sheng, Libo, 2021. "Influence difference main path analysis: Evidence from DNA and blockchain domain citation networks," Journal of Informetrics, Elsevier, vol. 15(4).
    15. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    16. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    17. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    18. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    19. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    20. Jeon, Daeseong & Lee, Junyoup & Ahn, Joon Mo & Lee, Changyong, 2023. "Measuring the novelty of scientific publications: A fastText and local outlier factor approach," Journal of Informetrics, Elsevier, vol. 17(4).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:11:d:10.1007_s11192-024-05150-9. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.