
From bench to bedside: determining what drives academic citations in clinical trials

Authors

  • Zhifeng Liu (Peking University)
  • Chenlin Wang (Peking University)
  • Ruojia Wang (Beijing University of Chinese Medicine)

Abstract

Academic research translates into improved health outcomes by influencing clinical trials, which in turn lead to changes in clinical practice. However, how academic research is incorporated into clinical trials remains underexplored. This study dissects the mechanisms underlying the translation of academic scholarship into clinical practice, focusing on how academic articles are cited within clinical trials. We first employed logistic regression to quantify the relationships between paper-related and author-related features and whether a paper was cited in clinical trials, offering an initial view of the primary factors shaping clinical citations. Building on this analysis, we adopted the SHAP explainability framework to probe how paper- and author-related features affect the decision to cite academic papers in clinical trials. The analysis reveals that citations are driven predominantly by features of the paper rather than of its authors. In particular, a paper's academic impact plays a pivotal role, including whether it ranks among the top 10% by total citations within its field and publication year and the number of citations it receives within its first year after publication. Features such as the number of references, the number of authors, the h-index of the first author and of their affiliated institution, and the number of institutions also facilitate citation in clinical trials. Intriguingly, however, the relationship between disruption and clinical citation follows an inverted U-shaped pattern. Our study deepens the understanding of how research is integrated into clinical trials, offering valuable insights for raising the translational potential of scholarly articles and facilitating their inspirational role in clinical application.
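The two-step approach described in the abstract can be illustrated compactly. The sketch below is a minimal, hypothetical reconstruction, not the authors' code or data: it fits a logistic regression on a synthetic paper-level feature table and then computes SHAP values from a gradient-boosting model to inspect non-linear feature effects such as the reported inverted-U shape for disruption. The feature names (top10_pct_cited, citations_first_year, n_references, n_authors, first_author_h_index, n_institutions, disruption), the random data, and the choice of gradient boosting for the SHAP step are illustrative assumptions.

    # Minimal sketch (hypothetical data and features, not the authors' pipeline).
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import GradientBoostingClassifier
    import shap

    # Synthetic placeholder table: one row per paper; target = cited in a clinical trial.
    rng = np.random.default_rng(0)
    X = pd.DataFrame({
        "top10_pct_cited": rng.integers(0, 2, 1000),      # top 10% by citations in field/year
        "citations_first_year": rng.poisson(5, 1000),
        "n_references": rng.integers(5, 80, 1000),
        "n_authors": rng.integers(1, 20, 1000),
        "first_author_h_index": rng.integers(0, 60, 1000),
        "n_institutions": rng.integers(1, 10, 1000),
        "disruption": rng.uniform(-1, 1, 1000),
    })
    y = rng.integers(0, 2, 1000)  # 1 = cited in a clinical trial (random placeholder labels)

    # Step 1: logistic regression quantifies each feature's association with clinical citation.
    logit = LogisticRegression(max_iter=1000).fit(X, y)
    print(dict(zip(X.columns, logit.coef_[0])))

    # Step 2: SHAP values from a non-linear model can expose shapes the linear model misses,
    # e.g. an inverted-U relationship between disruption and the probability of citation.
    gbm = GradientBoostingClassifier().fit(X, y)
    explainer = shap.Explainer(gbm, X)
    shap_values = explainer(X)
    shap.plots.scatter(shap_values[:, "disruption"])  # dependence plot for one feature

A SHAP dependence plot of this kind is the sort of output one would inspect to detect a non-monotonic relationship between a feature and the citation decision.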

Suggested Citation

  • Zhifeng Liu & Chenlin Wang & Ruojia Wang, 2024. "From bench to bedside: determining what drives academic citations in clinical trials," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6813-6837, November.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:11:d:10.1007_s11192-024-05173-2
    DOI: 10.1007/s11192-024-05173-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05173-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05173-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Dongyu Zang & Chunli Liu, 2023. "Exploring the clinical translation intensity of papers published by the world’s top scientists in basic medicine," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2371-2416, April.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Mike Thelwall & Kayvan Kousha, 2016. "Are citations from clinical trials evidence of higher impact research? An analysis of ClinicalTrials.gov," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1341-1351, November.
    4. Mike Thelwall & Nabeil Maflahi, 2016. "Guideline references and academic citations as evidence of the clinical value of health research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(4), pages 960-966, April.
    5. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    6. Dongyu Zang & Chunli Liu, 2023. "Correction: Exploring the clinical translation intensity of papers published by the world’s top scientists in basic medicine," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2417-2418, April.
    7. Magnus Eriksson & Annika Billhult & Tommy Billhult & Elena Pallari & Grant Lewison, 2020. "A new database of the references on international clinical practice guidelines: a facility for the evaluation of clinical research," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1221-1235, February.
    8. Zhihong Huang & Qianjin Zong & Xuerui Ji, 2022. "The associations between scientific collaborations of LIS research and its policy impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6453-6470, November.
    9. O. Mryglod & Yu. Holovatch & R. Kenna, 2022. "Big fish and small ponds: why the departmental h-index should not be used to rank universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3279-3292, June.
    10. Feiheng Luo & Aixin Sun & Mojisola Erdt & Aravind Sesagiri Raamkumar & Yin-Leng Theng, 2018. "Exploring prestigious citations sourced from top universities in bibliometrics and altmetrics: a case study in the computer science discipline," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 1-17, January.
    11. B Ian Hutchins & Matthew T Davis & Rebecca A Meseroll & George M Santangelo, 2019. "Predicting translational progress in biomedical research," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-25, October.
    12. Yeon Hak Kim & Aaron D. Levine & Eric J. Nehl & John P. Walsh, 2020. "A bibliometric measure of translational science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2349-2382, December.
    13. Liu, Jialin & Chen, Hongkan & Liu, Zhibo & Bu, Yi & Gu, Weiye, 2022. "Non-linearity between referencing behavior and citation impact: A large-scale, discipline-level analysis," Journal of Informetrics, Elsevier, vol. 16(3).
    14. Christopher Traylor & Christoph Herrmann-Lingen, 2023. "Does the journal impact factor reflect the impact of German medical guideline contributions?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1951-1962, March.
    15. Li, Xin & Tang, Xuli & Cheng, Qikai, 2022. "Predicting the clinical citation count of biomedical papers using multilayer perceptron neural network," Journal of Informetrics, Elsevier, vol. 16(4).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Li, Xin & Tang, Xuli & Cheng, Qikai, 2022. "Predicting the clinical citation count of biomedical papers using multilayer perceptron neural network," Journal of Informetrics, Elsevier, vol. 16(4).
    2. Dongyu Zang & Chunli Liu, 2023. "Exploring the clinical translation intensity of papers published by the world’s top scientists in basic medicine," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2371-2416, April.
    3. Li, Xin & Tang, Xuli & Lu, Wei, 2024. "Investigating clinical links in edge-labeled citation networks of biomedical research: A translational science perspective," Journal of Informetrics, Elsevier, vol. 18(3).
    4. Xin Li & Xuli Tang & Wei Lu, 2024. "How biomedical papers accumulated their clinical citations: a large-scale retrospective analysis based on PubMed," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(6), pages 3315-3339, June.
    5. Xin Li & Xuli Tang & Wei Lu, 2023. "Tracking biomedical articles along the translational continuum: a measure based on biomedical knowledge representation," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(2), pages 1295-1319, February.
    6. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    7. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    8. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    9. Diana Hicks & Julia Melkers & Kimberley R. Isett, 2019. "A characterization of professional media and its links to research," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 827-843, May.
    10. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    11. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    12. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    13. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    14. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    15. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    16. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    17. Yulin Yu & Daniel M. Romero, 2024. "Does the Use of Unusual Combinations of Datasets Contribute to Greater Scientific Impact?," Papers 2402.05024, arXiv.org, revised Sep 2024.
    18. Christian Catalini & Christian Fons-Rosen & Patrick Gaulé, 2020. "How Do Travel Costs Shape Collaboration?," Management Science, INFORMS, vol. 66(8), pages 3340-3360, August.
    19. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    20. Keye Wu & Ziyue Xie & Jia Tina Du, 2024. "Does science disrupt technology? Examining science intensity, novelty, and recency through patent-paper citations in the pharmaceutical field," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5469-5491, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:11:d:10.1007_s11192-024-05173-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.