
Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data

Authors

  • Bornmann, Lutz
  • Williams, Richard

Abstract

Early in a researcher's career, it is difficult to assess how good their work is or how important or influential they will eventually become. Funding agencies, academic departments, and others therefore often use the Journal Impact Factor (JIF) of the journals in which candidates have published to assess their work and to allocate resources and rewards for future work. This use of the JIF has, however, been heavily criticized. Using a large data set with many thousands of publication profiles of individual researchers, this study tests the ability of the JIF (in its normalized variant) to identify, at the beginning of their careers, those candidates who will be successful in the long run. Instead of bare JIFs and citation counts, the metrics used here are standardized by Web of Science subject category and publication year. The results indicate that the normalized JIF can discriminate between researchers whose later papers have a citation impact above or below the average of their field and publication year, not only in the short term but also in the long term. However, the low to medium effect sizes also indicate that the normalized JIF should not be used as the sole criterion for identifying later success: other criteria, such as the novelty and significance of the specific research, academic distinctions, and the reputation of previous institutions, should also be considered.
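The field- and year-normalization described in the abstract can be sketched as follows. This is a minimal illustration of an MNCS-style normalization (dividing each paper's citation count by the mean citation count of its subject category and publication year); the function name and the toy data are illustrative and are not the authors' actual pipeline.

```python
from collections import defaultdict

def normalized_citation_scores(papers):
    """Divide each paper's citation count by the mean citation count
    of all papers in the same subject category and publication year."""
    # Accumulate citation totals per (field, year) reference set.
    totals = defaultdict(lambda: [0, 0])  # (field, year) -> [sum, count]
    for p in papers:
        key = (p["field"], p["year"])
        totals[key][0] += p["citations"]
        totals[key][1] += 1

    scores = {}
    for p in papers:
        s, n = totals[(p["field"], p["year"])]
        mean = s / n
        # A score above 1 means above-average impact for that field and year.
        scores[p["id"]] = p["citations"] / mean if mean > 0 else 0.0
    return scores

# Toy example: two fields, one publication year.
papers = [
    {"id": "a", "field": "INF SCI", "year": 2010, "citations": 30},
    {"id": "b", "field": "INF SCI", "year": 2010, "citations": 10},
    {"id": "c", "field": "PHYSICS", "year": 2010, "citations": 50},
    {"id": "d", "field": "PHYSICS", "year": 2010, "citations": 150},
]
print(normalized_citation_scores(papers))
# "a" and "d" both score 1.5: above average within their own fields,
# even though their raw citation counts differ by a factor of five.
```

The point of the normalization is visible in the toy data: a raw count of 30 in a low-citation field can represent the same relative impact as 150 in a high-citation field.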

Suggested Citation

  • Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:3:p:788-799
    DOI: 10.1016/j.joi.2017.06.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157717300378
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2017.06.001?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.
    2. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    3. Lutz Bornmann, 2012. "The Hawthorne effect in journal peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 857-862, June.
    4. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    5. Sarah de Rijcke & Alexander Rushforth, 2015. "To intervene or not to intervene; is that the question? On the role of scientometrics in research evaluation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1954-1958, September.
    6. Ludo Waltman & Nees Jan van Eck & Thed N. van Leeuwen & Martijn S. Visser & Anthony F. J. van Raan, 2011. "Towards a new crown indicator: an empirical analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 467-481, June.
    7. Vincent Larivière & Rodrigo Costas, 2016. "How Many Is Too Many? On the Relationship between Research Productivity and Impact," PLOS ONE, Public Library of Science, vol. 11(9), pages 1-10, September.
    8. Williams, Richard & Bornmann, Lutz, 2016. "Sampling issues in bibliometric analysis," Journal of Informetrics, Elsevier, vol. 10(4), pages 1225-1232.
    9. Eugenie Samuel Reich, 2013. "Science publishing: The golden club," Nature, Nature, vol. 502(7471), pages 291-293, October.
    10. Sarah de Rijcke & Paul F. Wouters & Alex D. Rushforth & Thomas P. Franssen & Björn Hammarfelt, 2016. "Evaluation practices and effects of indicator use—a literature review," Research Evaluation, Oxford University Press, vol. 25(2), pages 161-169.
    11. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    12. John N. Parker & Stefano Allesina & Christopher J. Lortie, 2013. "Characterizing a scientific elite (B): publication and citation patterns of the most highly cited scientists in environmental science and ecology," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(2), pages 469-480, February.
    13. Sandra Miguel & Zaida Chinchilla-Rodriguez & Félix de Moya-Anegón, 2011. "Open access and Scopus: A new approach to scientific visibility from the standpoint of access," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(6), pages 1130-1145, June.
    14. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    15. Nabil Amara & Réjean Landry & Norrin Halilem, 2015. "What can university administrators do to increase the publication and citation scores of their faculty members?," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(2), pages 489-530, May.
    16. Koski, Timo & Sandström, Erik & Sandström, Ulf, 2016. "Towards field-adjusted production: Estimating research productivity from a zero-truncated distribution," Journal of Informetrics, Elsevier, vol. 10(4), pages 1143-1152.
    17. Abramo, Giovanni & D’Angelo, Ciriaco Andrea & Soldatenkova, Anastasiia, 2017. "An investigation on the skewness patterns and fractal nature of research productivity distributions at field and discipline level," Journal of Informetrics, Elsevier, vol. 11(1), pages 324-335.
    18. Bornmann, Lutz & Stefaner, Moritz & de Moya Anegón, Felix & Mutz, Rüdiger, 2014. "What is the effect of country-specific characteristics on the research performance of scientific institutions? Using multi-level statistical models to rank and map universities and research-focused in," Journal of Informetrics, Elsevier, vol. 8(3), pages 581-593.
    19. Ruiz-Castillo, Javier & Costas, Rodrigo, 2014. "The skewness of scientific productivity," Journal of Informetrics, Elsevier, vol. 8(4), pages 917-934.
    20. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    21. Andrea Diem & Stefan C. Wolter, 2011. "The Use of Bibliometrics to Measure Research Performance in Education Sciences," Economics of Education Working Paper Series 0066, University of Zurich, Department of Business Administration (IBW), revised May 2013.
    22. John P A Ioannidis & Kevin W Boyack & Richard Klavans, 2014. "Estimates of the Continuously Publishing Core in the Scientific Workforce," PLOS ONE, Public Library of Science, vol. 9(7), pages 1-10, July.
    23. Claveau, François, 2016. "There should not be any mystery: A comment on sampling issues in bibliometrics," Journal of Informetrics, Elsevier, vol. 10(4), pages 1233-1240.
    24. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    25. Weishu Liu & Guangyuan Hu & Mengdi Gu, 2016. "The probability of publishing in first-quartile journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 1273-1276, March.
    26. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2010. "Testing the trade-off between productivity and quality in research activities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 132-140, January.
    27. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    28. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bravo, Giangiacomo & Farjam, Mike & Grimaldo Moreno, Francisco & Birukou, Aliaksandr & Squazzoni, Flaminio, 2018. "Hidden connections: Network effects on editorial decisions in four computer science journals," Journal of Informetrics, Elsevier, vol. 12(1), pages 101-112.
    2. Lin Zhang & Yuanyuan Shang & Ying Huang & Gunnar Sivertsen, 2022. "Gender differences among active reviewers: an investigation based on publons," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 145-179, January.
    3. Zhang, Lin & Shang, Yuanyuan & HUANG, Ying & Sivertsen, Gunnar, 2021. "Gender differences among active reviewers: an investigation based on Publons," SocArXiv 4z6w8, Center for Open Science.
    4. Yves Fassin, 2021. "Does the Financial Times FT50 journal list select the best management and economics journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5911-5943, July.
    5. Xipeng Liu & Xinmiao Li, 2024. "Unbiased evaluation of ranking algorithms applied to the Chinese green patents citation network," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(6), pages 2999-3021, June.
    6. Pooyan Makvandi & Anahita Nodehi & Franklin R. Tay, 2021. "Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences," Publications, MDPI, vol. 9(2), pages 1-5, April.
    7. Danielle H. Lee, 2019. "Predicting the research performance of early career scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(3), pages 1481-1504, December.
    8. Lindahl, Jonas, 2018. "Predicting research excellence at the individual level: The importance of publication rate, top journal publications, and top 10% publications in the case of early career mathematicians," Journal of Informetrics, Elsevier, vol. 12(2), pages 518-533.
    9. Mingyang Wang & Shijia Jiao & Kah-Hin Chai & Guangsheng Chen, 2019. "Building journal’s long-term impact: using indicators detected from the sustained active articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 261-283, October.
    10. Li Hou & Qiang Wu & Yundong Xie, 2022. "Does early publishing in top journals really predict long-term scientific success in the business field?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6083-6107, November.
    11. Torres-Salinas, Daniel & Valderrama-Baca, Pilar & Arroyo-Machado, Wenceslao, 2022. "Is there a need for a new journal metric? Correlations between JCR Impact Factor metrics and the Journal Citation Indicator—JCI," Journal of Informetrics, Elsevier, vol. 16(3).
    12. Tóth, Tamás & Demeter, Márton & Csuhai, Sándor & Major, Zsolt Balázs, 2024. "When career-boosting is on the line: Equity and inequality in grant evaluation, productivity, and the educational backgrounds of Marie Skłodowska-Curie Actions individual fellows in social sciences an," Journal of Informetrics, Elsevier, vol. 18(2).
    13. Guangchao Charles Feng, 2020. "Research Performance Evaluation in China: A Big Data Analysis," SAGE Open, , vol. 10(1), pages 21582440199, January.
    14. Christopher Zou & Julia Tsui & Jordan B. Peterson, 2018. "The publication trajectory of graduate students, post-doctoral fellows, and new professors in psychology," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(2), pages 1289-1310, November.
    15. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    16. Claudiu Vasile Kifor & Ana Maria Benedek & Ioan Sîrbu & Roxana Florența Săvescu, 2023. "Institutional drivers of research productivity: a canonical multivariate analysis of Romanian public universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2233-2258, April.
    17. Xipeng Liu & Xinmiao Li, 2022. "Early Identification of Significant Patents Using Heterogeneous Applicant-Citation Networks Based on the Chinese Green Patent Data," Sustainability, MDPI, vol. 14(21), pages 1-27, October.
    18. Chen, Shiji & Qiu, Junping & Arsenault, Clément & Larivière, Vincent, 2021. "Exploring the interdisciplinarity patterns of highly cited papers," Journal of Informetrics, Elsevier, vol. 15(1).
    19. Zsolt Kohus & Márton Demeter & László Kun & Eszter Lukács & Katalin Czakó & Gyula Péter Szigeti, 2022. "A Study of the Relation between Byline Positions of Affiliated/Non-Affiliated Authors and the Scientific Impact of European Universities in Times Higher Education World University Rankings," Sustainability, MDPI, vol. 14(20), pages 1-14, October.
    20. Batista-Jr, Antônio de Abreu & Gouveia, Fábio Castro & Mena-Chalco, Jesús P., 2021. "Predicting the Q of junior researchers using data from the first years of publication," Journal of Informetrics, Elsevier, vol. 15(2).
    21. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    22. Ruben Miranda & Esther Garcia-Carpintero, 2019. "Comparison of the share of documents and citations from different quartile journals in 25 research areas," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 479-501, October.
    23. Jonas Lindahl & Cristian Colliander & Rickard Danell, 2020. "Early career performance and its correlation with gender and publication output during doctoral education," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 309-330, January.
    24. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    25. Pilar Valderrama & Manuel Escabias & Evaristo Jiménez-Contreras & Alberto Rodríguez-Archilla & Mariano J. Valderrama, 2018. "Proposal of a stochastic model to determine the bibliometric variables influencing the quality of a journal: application to the field of Dentistry," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1087-1095, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    2. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    3. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    4. Vîiu, Gabriel-Alexandru, 2017. "Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach," Journal of Informetrics, Elsevier, vol. 11(3), pages 748-765.
    5. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    6. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    7. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    8. Bornmann, Lutz & Leydesdorff, Loet, 2017. "Skewness of citation impact data and covariates of citation distributions: A large-scale empirical analysis based on Web of Science data," Journal of Informetrics, Elsevier, vol. 11(1), pages 164-175.
    9. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    10. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    11. Abramo, Giovanni & D'Angelo, CiriacoAndrea & Di Costa, Flavia, 2024. "The moderating role of personal characteristics of authors in the publications’ quality for quantity trade-off," Journal of Informetrics, Elsevier, vol. 18(1).
    12. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A comparison of university performance scores and ranks by MNCS and FSS," Journal of Informetrics, Elsevier, vol. 10(4), pages 889-901.
    13. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    14. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    15. Haunschild, Robin & Bornmann, Lutz, 2016. "Normalization of Mendeley reader counts for impact assessment," Journal of Informetrics, Elsevier, vol. 10(1), pages 62-73.
    16. Thelwall, Mike & Fairclough, Ruth, 2017. "The accuracy of confidence intervals for field normalised indicators," Journal of Informetrics, Elsevier, vol. 11(2), pages 530-540.
    17. Bornmann, Lutz & Haunschild, Robin, 2018. "Normalization of zero-inflated data: An empirical analysis of a new indicator family and its use with altmetrics data," Journal of Informetrics, Elsevier, vol. 12(3), pages 998-1011.
    18. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    19. Gabriel-Alexandru Vȋiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    20. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:11:y:2017:i:3:p:788-799. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.