
A quantitative analysis of researcher citation personal display considering disciplinary differences and influence factors

Authors

  • Xingchen Li (University of Science and Technology of China)
  • Qiang Wu (University of Science and Technology of China)
  • Yuanyuan Liu (University of Science and Technology of China)

Abstract

Personal websites are a good place not only for scientists to present a wealth of content, but also for researchers to mine useful information for quantitative evaluation. Based on researchers’ personal websites, this study investigates the degree of citation personal display (CPD) in three major disciplines (chemistry, mathematics, and physics), as well as disciplinary differences in CPD. It also examines the factors that influence CPD using binary logistic regression. The dataset consists of 5771 researchers at 39 U.S. universities. Results show that CPD varies significantly by discipline, with chemistry researchers having the highest CPD (15.3%), followed by physics researchers (12.7%) and mathematics researchers (7.1%). The binary logistic models indicate that total citations, h-index, and citations per publication have significantly positive effects on CPD in chemistry; for mathematics, total citations and h-index do; and for physics, only total citations does (p
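The modelling step described in the abstract is a binary logistic regression: the outcome is whether a researcher's personal website displays citation information (CPD, yes/no), and the predictors are author-level bibliometric indicators. The sketch below is not the authors' code; it only illustrates how such a model could be fit in Python with statsmodels, with a hypothetical file name and hypothetical column names (cpd, total_citations, h_index, cites_per_pub) standing in for the indicators named in the abstract.

    # Minimal sketch (assumed data layout, not the authors' code): binary logistic
    # regression of citation personal display (CPD, 0/1) on bibliometric indicators.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical dataset: one row per researcher.
    df = pd.read_csv("researchers.csv")  # columns: cpd, total_citations, h_index, cites_per_pub

    X = sm.add_constant(df[["total_citations", "h_index", "cites_per_pub"]])
    y = df["cpd"]  # 1 if the personal website displays citation counts, 0 otherwise

    result = sm.Logit(y, X).fit()
    print(result.summary())        # coefficients and p-values for each predictor
    print(np.exp(result.params))   # odds ratios, which are easier to interpret

Fitting a separate model of this form per discipline (chemistry, mathematics, physics) would mirror the per-field comparisons reported in the abstract.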

Suggested Citation

  • Xingchen Li & Qiang Wu & Yuanyuan Liu, 2017. "A quantitative analysis of researcher citation personal display considering disciplinary differences and influence factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1093-1112, November.
  • Handle: RePEc:spr:scient:v:113:y:2017:i:2:d:10.1007_s11192-017-2501-0
    DOI: 10.1007/s11192-017-2501-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-017-2501-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-017-2501-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access with your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Vincent Larivière & Yves Gingras, 2010. "On the relationship between interdisciplinarity and scientific impact," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(1), pages 126-131, January.
    2. Eva Lillquist & Sheldon Green, 2010. "The discipline dependence of citation statistics," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 749-762, September.
    3. Philip Ball, 2005. "Index aims for fair ranking of scientists," Nature, Nature, vol. 436(7053), pages 900-900, August.
    4. Amalia Mas-Bleda & Mike Thelwall & Kayvan Kousha & Isidro F. Aguillo, 2014. "Do highly cited researchers successfully use the social web?," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 337-356, October.
    5. Bornmann, Lutz & Williams, Richard, 2013. "How to calculate the practical significance of citation impact differences? An empirical example from evaluative institutional bibliometrics using adjusted predictions and marginal effects," Journal of Informetrics, Elsevier, vol. 7(2), pages 562-574.
    6. Rodrigo Costas & Thed N. van Leeuwen & María Bordons, 2010. "A bibliometric classificatory approach for the study and assessment of research performance at the individual level: The effects of age on productivity and impact," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(8), pages 1564-1581, August.
    7. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    8. Chung Joo Chung & Han Woo Park, 2012. "Web visibility of scholars in media and communication journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(1), pages 207-215, October.
    9. Jacob B. Slyder & Beth R. Stein & Brent S. Sams & David M. Walker & B. Jacob Beale & Jeffrey J. Feldhaus & Carolyn A. Copenheaver, 2011. "Citation pattern and lifespan: a comparison of discipline, institution, and individual," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 955-966, December.
    10. Pablo Jensen & Jean-Baptiste Rouquier & Yves Croissant, 2009. "Testing bibliometric indicators by their prediction of scientists promotions," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(3), pages 467-479, March.
    11. Bornmann, Lutz & Stefaner, Moritz & de Moya Anegón, Felix & Mutz, Rüdiger, 2014. "What is the effect of country-specific characteristics on the research performance of scientific institutions? Using multi-level statistical models to rank and map universities and research-focused institutions," Journal of Informetrics, Elsevier, vol. 8(3), pages 581-593.
    12. Javier Espadas & Coral Calero & Mario Piattini, 2008. "Web site visibility evaluation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1727-1742, September.
    13. Kayvan Kousha & Mike Thelwall, 2014. "Disseminating research with web CV hyperlinks," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(8), pages 1615-1626, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Pilar Valderrama & Manuel Escabias & Evaristo Jiménez-Contreras & Mariano J. Valderrama & Pilar Baca, 2018. "A mixed longitudinal and cross-sectional model to forecast the journal impact factor in the field of Dentistry," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 1203-1212, August.
    2. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a sketch of this ranking idea appears after the list.
    1. Zheng Yan & Wenqian Robertson & Yaosheng Lou & Tom W. Robertson & Sung Yong Park, 2021. "Finding leading scholars in mobile phone behavior: a mixed-method analysis of an emerging interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9499-9517, December.
    2. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    3. Vincent Larivière & Rodrigo Costas, 2016. "How Many Is Too Many? On the Relationship between Research Productivity and Impact," PLOS ONE, Public Library of Science, vol. 11(9), pages 1-10, September.
    4. Rodrigo Costas & Thomas Franssen, 2018. "Reflections around ‘the cautionary use’ of the h-index: response to Teixeira da Silva and Dobránszki," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1125-1130, May.
    5. Peter van den Besselaar & Ulf Sandström, 2019. "Measuring researcher independence using bibliometric data: A proposal for a new performance indicator," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-20, March.
    6. Fan, Lingxu & Guo, Lei & Wang, Xinhua & Xu, Liancheng & Liu, Fangai, 2022. "Does the author’s collaboration mode lead to papers’ different citation impacts? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 16(4).
    7. Sixto-Costoya Andrea & Robinson-Garcia Nicolas & Leeuwen Thed & Costas Rodrigo, 2021. "Exploring the relevance of ORCID as a source of study of data sharing activities at the individual-level: a methodological discussion," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 7149-7165, August.
    8. Vîiu, Gabriel-Alexandru, 2017. "Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach," Journal of Informetrics, Elsevier, vol. 11(3), pages 748-765.
    9. Xiaozan Lyu & Rodrigo Costas, 2021. "Studying the characteristics of scientific communities using individual-level bibliometrics: the case of Big Data research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6965-6987, August.
    10. Marek Kwiek & Wojciech Roszka, 2022. "Academic vs. biological age in research on academic careers: a large-scale study with implications for scientifically developing systems," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3543-3575, June.
    11. Hackett, Edward J. & Leahey, Erin & Parker, John N. & Rafols, Ismael & Hampton, Stephanie E. & Corte, Ugo & Chavarro, Diego & Drake, John M. & Penders, Bart & Sheble, Laura & Vermeulen, Niki & Vision,, 2021. "Do synthesis centers synthesize? A semantic analysis of topical diversity in research," Research Policy, Elsevier, vol. 50(1).
    12. Xian Li & Ronald Rousseau & Liming Liang & Fangjie Xi & Yushuang Lü & Yifan Yuan & Xiaojun Hu, 2022. "Is low interdisciplinarity of references an unexpected characteristic of Nobel Prize winning research?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2105-2122, April.
    13. Rafols, Ismael & Leydesdorff, Loet & O’Hare, Alice & Nightingale, Paul & Stirling, Andy, 2012. "How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management," Research Policy, Elsevier, vol. 41(7), pages 1262-1282.
    14. Rodrigo Costas & María Bordons, 2011. "Do age and professional rank influence the order of authorship in scientific publications? Some evidence from a micro-level perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(1), pages 145-161, July.
    15. van Rijnsoever, Frank J. & Hessels, Laurens K., 2011. "Factors associated with disciplinary and interdisciplinary research collaboration," Research Policy, Elsevier, vol. 40(3), pages 463-472, April.
    16. Lathabai, Hiran H., 2020. "ψ-index: A new overall productivity index for actors of science and technology," Journal of Informetrics, Elsevier, vol. 14(4).
    17. Claus-Christian Carbon, 2011. "The Carbon_h-Factor: Predicting Individuals' Research Impact at Early Stages of Their Career," PLOS ONE, Public Library of Science, vol. 6(12), pages 1-7, December.
    18. Lawson, Cornelia & Soós, Sándor, 2014. "A Thematic Mobility Measure for Econometric Analysis," Department of Economics and Statistics Cognetti de Martiis. Working Papers 201408, University of Turin.
    19. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    20. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.
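    The list above is built from the relatedness criterion described before it. A minimal sketch of that idea follows, under the assumption that relatedness is scored as shared references (bibliographic coupling) plus shared citing items (co-citation); the actual IDEAS/CitEc procedure may differ, and all names here are illustrative.

    # Minimal sketch (assumed scoring rule, not the actual IDEAS/CitEc implementation):
    # rank candidate items by how many references and citing items they share with
    # the focal item.
    from typing import Dict, List, Set, Tuple

    def relatedness(focal_refs: Set[str], focal_citers: Set[str],
                    cand_refs: Set[str], cand_citers: Set[str]) -> int:
        """Shared references (bibliographic coupling) plus shared citers (co-citation)."""
        return len(focal_refs & cand_refs) + len(focal_citers & cand_citers)

    def rank_related(focal: str,
                     refs: Dict[str, Set[str]],
                     citers: Dict[str, Set[str]],
                     top_n: int = 20) -> List[Tuple[str, int]]:
        # Score every other item against the focal item and keep the top_n.
        scores = {
            item: relatedness(refs[focal], citers.get(focal, set()),
                              refs[item], citers.get(item, set()))
            for item in refs if item != focal
        }
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]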

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:113:y:2017:i:2:d:10.1007_s11192-017-2501-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.