
Analyzing the influence of prolific collaborations on authors productivity and visibility

Authors

Listed:
  • Ana C. M. Brito

    (University of São Paulo)

  • Filipi N. Silva

    (Indiana University Network Science Institute)

  • Diego R. Amancio

    (University of São Paulo)

Abstract

Science has become more collaborative in recent years, as evidenced by the growing number of authors per publication and the emergence of interdisciplinary research endeavors involving specialists from different fields. In this context, quantifying the individual impact of researchers is not trivial. To address this issue, we evaluate the effect of the most productive collaboration tie (as measured by the number of co-authored papers) on the productivity and visibility metrics of established researchers. We analyzed the impact on a researcher's metrics, such as the number of publications, citations, and h-index, when the works co-authored with their most productive collaborator were excluded from the analysis. A comprehensive analysis of over 243 million papers revealed different patterns of prolific-collaborator influence across the major fields of knowledge. In formal and applied sciences, the impact of prolific collaborators on the visibility metrics of authors is substantial, even among those who are highly cited. These results have significant implications for stakeholders seeking to understand collaboration patterns and to develop measures of success that account for collaboration ties.
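
To make the procedure described in the abstract concrete, the sketch below (illustrative only, not the authors' released code) recomputes an author's publication count, citation count, and h-index after excluding papers co-authored with their most prolific collaborator. The record format and field names are assumptions made for the example.

    from collections import Counter

    def h_index(citation_counts):
        # Largest h such that at least h papers have >= h citations each.
        h = 0
        for i, c in enumerate(sorted(citation_counts, reverse=True), start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    def metrics_without_top_collaborator(author, papers):
        # papers: list of records like {"authors": [...], "citations": int}
        # (this record format is an assumption for the example).
        mine = [p for p in papers if author in p["authors"]]
        # The most productive collaboration tie is the co-author sharing
        # the largest number of papers with this author.
        ties = Counter(a for p in mine for a in p["authors"] if a != author)
        if not ties:
            return None
        top, _ = ties.most_common(1)[0]
        kept = [p for p in mine if top not in p["authors"]]

        def summarize(ps):
            return {"papers": len(ps),
                    "citations": sum(p["citations"] for p in ps),
                    "h": h_index([p["citations"] for p in ps])}

        return top, summarize(mine), summarize(kept)

    # Toy example: excluding papers shared with the top collaborator ("B")
    # drops author "A" from 3 papers / 47 citations / h = 3 to 1 / 5 / 1.
    toy = [{"authors": ["A", "B"], "citations": 30},
           {"authors": ["A", "B", "C"], "citations": 12},
           {"authors": ["A", "C"], "citations": 5}]
    print(metrics_without_top_collaborator("A", toy))

The gap between the "before" and "after" metrics is the quantity the paper studies across fields of knowledge.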

Suggested Citation

  • Ana C. M. Brito & Filipi N. Silva & Diego R. Amancio, 2023. "Analyzing the influence of prolific collaborations on authors productivity and visibility," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2471-2487, April.
  • Handle: RePEc:spr:scient:v:128:y:2023:i:4:d:10.1007_s11192-023-04669-7
    DOI: 10.1007/s11192-023-04669-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04669-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04669-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Isola Ajiferuke & Kun Lu & Dietmar Wolfram, 2010. "A comparison of citer and citation‐based measure outcomes for multiple disciplines," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(10), pages 2086-2096, October.
    2. Vincent Larivière & Yves Gingras & Cassidy R. Sugimoto & Andrew Tsou, 2015. "Team size matters: Collaboration and scientific impact since 1900," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(7), pages 1323-1332, July.
    3. John P A Ioannidis & Kevin W Boyack & Jeroen Baas, 2020. "Updated science-wide author databases of standardized citation indicators," PLOS Biology, Public Library of Science, vol. 18(10), pages 1-3, October.
    4. Xiomara S. Q. Chacon & Thiago C. Silva & Diego R. Amancio, 2020. "Comparing the impact of subfields in scientific journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 625-639, October.
    5. Isola Ajiferuke & Dietmar Wolfram, 2010. "Citer analysis as a measure of research impact: library and information science as a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(3), pages 623-638, June.
    6. Donald Deb. Beaver, 2001. "Reflections on Scientific Collaboration (and its study): Past, Present, and Future," Scientometrics, Springer;Akadémiai Kiadó, vol. 52(3), pages 365-377, November.
    7. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    8. Richard B. Freeman & Ina Ganguli & Raviv Murciano-Goroff, 2014. "Why and Wherefore of Increased Scientific Collaboration," NBER Chapters, in: The Changing Frontier: Rethinking Science and Innovation Policy, pages 17-48, National Bureau of Economic Research, Inc.
    9. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2014. "How do you define and measure research productivity?," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1129-1144, November.
    10. Silva, Filipi N. & Amancio, Diego R. & Bardosova, Maria & Costa, Luciano da F. & Oliveira, Osvaldo N., 2016. "Using network science and text analytics to produce surveys in a scientific topic," Journal of Informetrics, Elsevier, vol. 10(2), pages 487-502.
    11. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2015. "The relationship between the number of authors of a publication, its citations and the impact factor of the publishing journal: Evidence from Italy," Journal of Informetrics, Elsevier, vol. 9(4), pages 746-761.
    12. Brito, Ana C.M. & Silva, Filipi N. & Amancio, Diego R., 2021. "Associations between author-level metrics in subsequent time periods," Journal of Informetrics, Elsevier, vol. 15(4).
    13. Pablo D. Batista & Mônica G. Campiteli & Osame Kinouchi, 2006. "Is it possible to compare researchers with different scientific interests?," Scientometrics, Springer;Akadémiai Kiadó, vol. 68(1), pages 179-189, July.
    14. Thelwall, Mike, 2017. "Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals," Journal of Informetrics, Elsevier, vol. 11(4), pages 1201-1212.
    15. Anne-Wil Harzing, 2016. "Microsoft Academic (Search): a Phoenix arisen from the ashes?," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1637-1647, September.
    16. M. Pelacho & G. Ruiz & F. Sanz & A. Tarancón & J. Clemente-Gallardo, 2021. "Analysis of the evolution and collaboration networks of citizen science scientific publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 225-257, January.
    17. Jian Qin & F. W. Lancaster & Bryce Allen, 1997. "Types and levels of collaboration in interdisciplinary research in the sciences," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 48(10), pages 893-916, October.
    18. Corrêa Jr., Edilson A. & Silva, Filipi N. & da F. Costa, Luciano & Amancio, Diego R., 2017. "Patterns of authors contribution in scientific manuscripts," Journal of Informetrics, Elsevier, vol. 11(2), pages 498-510.
    19. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2013. "Individual research performance: A proposal for comparing apples to oranges," Journal of Informetrics, Elsevier, vol. 7(2), pages 528-539.
    20. Tohalino, Jorge V. & Amancio, Diego R., 2018. "Extractive multi-document summarization using multilayer networks," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 503(C), pages 526-539.
    21. Viana, Matheus P. & Amancio, Diego R. & da F. Costa, Luciano, 2013. "On time-varying collaboration networks," Journal of Informetrics, Elsevier, vol. 7(2), pages 371-378.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fabio Zagonari & Paolo Foschi, 2024. "Coping with the Inequity and Inefficiency of the H-Index: A Cross-Disciplinary Empirical Analysis," Publications, MDPI, vol. 12(2), pages 1-30, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tohalino, Jorge A.V. & Amancio, Diego R., 2022. "On predicting research grants productivity via machine learning," Journal of Informetrics, Elsevier, vol. 16(2).
    2. Jingda Ding & Chao Liu & Goodluck Asobenie Kandonga, 2020. "Exploring the limitations of the h-index and h-type indexes in measuring the research performance of authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1303-1322, March.
    3. Corrêa, Edilson A. & Marinho, Vanessa Q. & Amancio, Diego R., 2020. "Semantic flow in language networks discriminates texts by genre and publication date," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 557(C).
    4. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    5. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    6. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    7. Adilson Vital & Diego R. Amancio, 2022. "A comparative analysis of local similarity metrics and machine learning approaches: application to link prediction in author citation networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 6011-6028, October.
    8. Ajiferuke, Isola & Lu, Kun & Wolfram, Dietmar, 2011. "Who are the research disciples of an author? Examining publication recitation and oeuvre citation exhaustivity," Journal of Informetrics, Elsevier, vol. 5(2), pages 292-302.
    9. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    10. Michael Thelwall, 2018. "Can Microsoft Academic be used for citation analysis of preprint archives? The case of the Social Science Research Network," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 913-928, May.
    11. Kun Lu & Isola Ajiferuke & Dietmar Wolfram, 2014. "Extending citer analysis to journal impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 100(1), pages 245-260, July.
    12. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.
    13. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2019. "The collaboration behavior of top scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 215-232, January.
    14. Aliakbar Akbaritabar & Niccolò Casnici & Flaminio Squazzoni, 2018. "The conundrum of research productivity: a study on sociologists in Italy," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 859-882, March.
    15. Marian-Gabriel Hâncean & Matjaž Perc & Jürgen Lerner, 2021. "The coauthorship networks of the most productive European researchers," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 201-224, January.
    16. Laurent R. Bergé, 2017. "Network proximity in the geography of research collaboration," Papers in Regional Science, Wiley Blackwell, vol. 96(4), pages 785-815, November.
    17. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Di Costa, Flavia, 2021. "The scholarly impact of private sector research: A multivariate analysis," Journal of Informetrics, Elsevier, vol. 15(3).
    18. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2020. "The relative impact of private research on scientific advancement," Papers 2012.04908, arXiv.org.
    19. Zoltán Krajcsák, 2021. "Researcher Performance in Scopus Articles ( RPSA ) as a New Scientometric Model of Scientific Output: Tested in Business Area of V4 Countries," Publications, MDPI, vol. 9(4), pages 1-23, October.
    20. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:128:y:2023:i:4:d:10.1007_s11192-023-04669-7. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.