Printed from https://ideas.repec.org/a/eee/infome/v18y2024i1s1751157723001050.html

Unveiling the impact and dual innovation of funded research

Author

Listed:
  • Yang, Alex J.

Abstract

Understanding the impact and innovative character of funded research is central to science policy. This article examines research supported by the National Institutes of Health (NIH) and the National Science Foundation (NSF). Research impact is measured by citations from papers, patents, and tweets. Innovation is evaluated along two complementary dimensions: ex-ante innovation (Novelty) and ex-post innovation (Disruption). Novelty denotes atypical combinations of existing knowledge, while Disruption captures a paper's paradigm-shifting potential and its disruptive influence on subsequent research. The analysis yields three findings. First, funded research projects show markedly higher impact than their non-funded counterparts. Second, funded research exhibits significantly higher ex-ante innovation (Novelty). Third, and unexpectedly, the effect of funding on ex-post innovation (Disruption) is weak. Robustness checks across years and fields show that, despite the uneven distribution of NIH- and NSF-funded research and modest heterogeneity across fields, the patterns of impact and dual innovation of funded research are consistent across almost all fields.
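The two innovation measures in the abstract have standard operationalizations in the science-of-science literature: Novelty follows the atypical-combinations approach, and Disruption follows the CD index of Funk & Owen-Smith (2017), which appears in the reference list below. A minimal sketch of the CD index follows; the function name and input layout are illustrative assumptions, not the paper's actual implementation:

```python
def disruption_index(focal, focal_refs, later_papers):
    """CD (disruption) index, after Funk & Owen-Smith (2017).

    focal        -- id of the focal paper
    focal_refs   -- set of ids cited by the focal paper
    later_papers -- {paper_id: set of ids it cites} for papers
                    published after the focal paper

    Returns a value in [-1, 1]: positive means disruptive (later work
    cites the focal paper while ignoring its references); negative
    means consolidating (later work cites both).
    """
    n_i = n_j = n_k = 0
    for refs in later_papers.values():
        cites_focal = focal in refs
        cites_its_refs = bool(refs & focal_refs)
        if cites_focal and not cites_its_refs:
            n_i += 1   # disrupting citations: focal only
        elif cites_focal and cites_its_refs:
            n_j += 1   # consolidating citations: focal and its refs
        elif cites_its_refs:
            n_k += 1   # citations that bypass the focal paper
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0


# Toy example: two later papers cite F without its references,
# one cites F together with its reference A, one cites A alone.
cd = disruption_index(
    "F", {"A", "B"},
    {"P1": {"F"}, "P2": {"F", "C"}, "P3": {"F", "A"}, "P4": {"A"}},
)  # -> 0.25
```

Novelty, by contrast, is typically computed as z-scores of observed journal-pair co-citation frequencies against a randomized baseline, which requires corpus-level data and is therefore not sketched here.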

Suggested Citation

  • Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
  • Handle: RePEc:eee:infome:v:18:y:2024:i:1:s1751157723001050
    DOI: 10.1016/j.joi.2023.101480

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157723001050
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2023.101480?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Pierre Azoulay & Erica Fuchs & Anna P. Goldstein & Michael Kearney, 2018. "Funding Breakthrough Research: Promises and Challenges of the "ARPA Model"," NBER Chapters, in: Innovation Policy and the Economy, Volume 19, pages 69-96, National Bureau of Economic Research, Inc.
    4. Jian Gao & Yi-Cheng Zhang & Tao Zhou, 2019. "Computational Socioeconomics," Papers 1905.06166, arXiv.org.
    5. Matt Marx & Aaron Fuegi, 2022. "Reliance on science by inventors: Hybrid extraction of in‐text patent‐to‐article citations," Journal of Economics & Management Strategy, Wiley Blackwell, vol. 31(2), pages 369-392, April.
    6. Park, Hyunwoo & Lee, Jeongsik (Jay) & Kim, Byung-Cheol, 2015. "Project selection in NIH: A natural experiment from ARRA," Research Policy, Elsevier, vol. 44(6), pages 1145-1159.
    7. James G. March, 1991. "Exploration and Exploitation in Organizational Learning," Organization Science, INFORMS, vol. 2(1), pages 71-87, February.
    8. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    9. repec:nas:journl:v:115:y:2018:p:2329-2334 is not listed on IDEAS
    10. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    11. Mikko Packalen & Jay Bhattacharya, 2020. "NIH funding and the pursuit of edge science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(22), pages 12011-12016, June.
    12. Pierre Azoulay & Joshua S Graff Zivin & Danielle Li & Bhaven N Sampat, 2019. "Public R&D Investments and Private-sector Patenting: Evidence from NIH Funding Rules," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 86(1), pages 117-152.
    13. Shor, Boris & Bafumi, Joseph & Keele, Luke & Park, David, 2007. "A Bayesian Multilevel Modeling Approach to Time-Series Cross-Sectional Data," Political Analysis, Cambridge University Press, vol. 15(2), pages 165-181, April.
    14. Michael Park & Erin Leahey & Russell J. Funk, 2023. "Papers and patents are becoming less disruptive over time," Nature, Nature, vol. 613(7942), pages 138-144, January.
    15. Pierre Azoulay, 2012. "Turn the scientific method on ourselves," Nature, Nature, vol. 484(7392), pages 31-32, April.
    16. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    17. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    18. Loet Leydesdorff & Lutz Bornmann & Caroline S. Wagner, 2019. "The Relative Influences of Government Funding and International Collaboration on Citation Impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(2), pages 198-201, February.
    19. Loet Leydesdorff, 2018. "Diversity and interdisciplinarity: how can one distinguish and recombine disparity, variety, and balance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2113-2121, September.
    20. Yian Yin & Yang Wang & James A. Evans & Dashun Wang, 2019. "Quantifying the dynamics of failure across science, startups and security," Nature, Nature, vol. 575(7781), pages 190-194, November.
    21. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    22. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    23. Fengli Xu & Lingfei Wu & James Evans, 2022. "Flat teams drive scientific innovation," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 119(23), June.
    24. Wagner, Caroline S. & Whetsell, Travis A. & Mukherjee, Satyam, 2019. "International research collaboration: Novelty, conventionality, and atypicality in knowledge recombination," Research Policy, Elsevier, vol. 48(5), pages 1260-1270.
    25. Andy Stirling, 2007. "A General Framework for Analysing Diversity in Science, Technology and Society," SPRU Working Paper Series 156, SPRU - Science Policy Research Unit, University of Sussex Business School.
    26. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    27. Bas Hofstra & Vivek V. Kulkarni & Sebastian Munoz-Najar Galvez & Bryan He & Dan Jurafsky & Daniel A. McFarland, 2020. "The Diversity–Innovation Paradox in Science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(17), pages 9284-9291, April.
    28. Leydesdorff, Loet & Wagner, Caroline S. & Bornmann, Lutz, 2019. "Interdisciplinarity as diversity in citation patterns among journals: Rao-Stirling diversity, relative variety, and the Gini coefficient," Journal of Informetrics, Elsevier, vol. 13(1), pages 255-269.
    29. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    30. Pierre Azoulay & Erica Fuchs & Anna Goldstein & Michael Kearney, 2018. "Funding Breakthrough Research: Promises and Challenges of the “ARPA Model”," NBER Working Papers 24674, National Bureau of Economic Research, Inc.
    31. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    32. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    33. Yian Yin & Yuxiao Dong & Kuansan Wang & Dashun Wang & Benjamin F. Jones, 2022. "Public use and public funding of science," Nature Human Behaviour, Nature, vol. 6(10), pages 1344-1350, October.
    34. Feng Shi & James Evans, 2023. "Surprising combinations of research contents and contexts are related to impact and emerge with scientific outsiders from distant disciplines," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    35. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    36. Matt Marx & Aaron Fuegi, 2020. "Reliance on science: Worldwide front‐page patent citations to scientific articles," Strategic Management Journal, Wiley Blackwell, vol. 41(9), pages 1572-1594, September.
    37. Danielle Li, 2017. "Expertise versus Bias in Evaluation: Evidence from the NIH," American Economic Journal: Applied Economics, American Economic Association, vol. 9(2), pages 60-92, April.
    38. Paula Stephan & Reinhilde Veugelers & Jian Wang, 2017. "Reviewers are blinkered by bibliometrics," Nature, Nature, vol. 544(7651), pages 411-412, April.
    39. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    40. Johan S. G. Chu & James A. Evans, 2021. "Slowed canonical progress in large fields of science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 118(41), October.
    41. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Alex J. Yang & Huimin Xu & Ying Ding & Meijun Liu, 2024. "Unveiling the dynamics of team age structure and its impact on scientific innovation," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(10), pages 6127-6148, October.
    2. Wei Cheng & Dejun Zheng & Shaoxiong Fu & Jingfeng Cui, 2024. "Closer in time and higher correlation: disclosing the relationship between citation similarity and citation interval," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4495-4512, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    2. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    3. Alex J. Yang & Huimin Xu & Ying Ding & Meijun Liu, 2024. "Unveiling the dynamics of team age structure and its impact on scientific innovation," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(10), pages 6127-6148, October.
    4. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    5. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    6. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    7. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    8. Zhang, Yang & Wang, Yang & Du, Haifeng & Havlin, Shlomo, 2024. "Delayed citation impact of interdisciplinary research," Journal of Informetrics, Elsevier, vol. 18(1).
    9. Guo, Liying & Wang, Yang & Li, Meiling, 2024. "Exploration, exploitation and funding success: Evidence from junior scientists supported by the Chinese Young Scientists Fund," Journal of Informetrics, Elsevier, vol. 18(2).
    10. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    11. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    12. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    13. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    14. Keye Wu & Ziyue Xie & Jia Tina Du, 2024. "Does science disrupt technology? Examining science intensity, novelty, and recency through patent-paper citations in the pharmaceutical field," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5469-5491, September.
    15. Ziyan Zhang & Junyan Zhang & Pushi Wang, 2024. "Measurement of disruptive innovation and its validity based on improved disruption index," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6477-6531, November.
    16. Giulio Giacomo Cantone, 2024. "How to measure interdisciplinary research? A systemic design for the model of measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4937-4982, August.
    17. Hou, Jianhua & Li, Hao & Zhang, Yang, 2024. "Influence of interdisciplinarity of scientific papers on the durability of citation diffusion: A perspective from citation discontinuance," Journal of Informetrics, Elsevier, vol. 18(3).
    18. Shiji Chen & Yanhui Song & Fei Shu & Vincent Larivière, 2022. "Interdisciplinarity and impact: the effects of the citation time window," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2621-2642, May.
    19. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    20. Xin Liu & Yi Bu & Ming Li & Jiang Li, 2024. "Monodisciplinary collaboration disrupts science more than multidisciplinary collaboration," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 75(1), pages 59-78, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:18:y:2024:i:1:s1751157723001050. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.