Printed from https://ideas.repec.org/a/plo/pone00/0155097.html

Performance Benchmarks for Scholarly Metrics Associated with Fisheries and Wildlife Faculty

Author

Listed:
  • Robert K Swihart
  • Mekala Sundaram
  • Tomas O Höök
  • J Andrew DeWoody
  • Kenneth F Kellner

Abstract

Research productivity and impact are often considered in professional evaluations of academics, and performance metrics based on publications and citations are increasingly used in such evaluations. To promote evidence-based and informed use of these metrics, we collected publication and citation data for 437 tenure-track faculty members at 33 research-extensive universities in the United States belonging to the National Association of University Fisheries and Wildlife Programs. For each faculty member, we computed 8 commonly used performance metrics based on numbers of publications and citations, and recorded covariates including academic age (time since Ph.D.), sex, percentage of appointment devoted to research, and sub-disciplinary research focus. Standardized deviance residuals from regression models were used to compare faculty after accounting for variation in performance due to these covariates. We also aggregated residuals to enable comparison across universities. Finally, we tested for temporal trends in citation practices to assess whether the "law of constant ratios", used to enable comparison of performance metrics between disciplines that differ in citation and publication practices, applied to fisheries and wildlife sub-disciplines when mapped to Web of Science Journal Citation Report categories. Our regression models reduced deviance by one-fourth to one-half. Standardized residuals for each faculty member, whether combined across metrics as a simple average or weighted via factor analysis, produced similar results in terms of performance based on percentile rankings. Significant variation was observed in scholarly performance across universities, after accounting for the influence of covariates. In contrast to findings for other disciplines, normalized citation ratios for fisheries and wildlife sub-disciplines increased across years; increases were comparable for all sub-disciplines except ecology. We discuss the advantages and limitations of our methods, illustrate their use when applied to new data, and suggest future improvements. Our benchmarking approach may provide a useful tool to augment detailed, qualitative assessment of performance.
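The benchmarking idea in the abstract can be sketched in a few lines: regress a citation metric on covariates, standardize the residuals, and convert them to within-cohort percentile ranks. This is a minimal illustration, not the authors' code: the paper uses standardized deviance residuals from regression models over several covariates (academic age, sex, research appointment, sub-discipline), whereas this sketch uses ordinary least squares on log citations against academic age only, with entirely hypothetical data.

```python
# Hedged sketch (not the authors' implementation): benchmark faculty on a
# citation metric by regressing log(citations) on academic age, then
# percentile-ranking the standardized residuals. All names/data hypothetical.
import math

def ols_residuals(x, y):
    """One-covariate OLS with intercept; residuals scaled to unit variance."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)
    alpha = my - beta * mx
    resid = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
    sd = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual std. error
    return [r / sd for r in resid]

def percentile_ranks(scores):
    """Percentile rank (0-100) of each score within the cohort."""
    n = len(scores)
    return [100.0 * sum(s2 <= s for s2 in scores) / n for s in scores]

# Hypothetical cohort: academic age (years since Ph.D.) and total citations.
age = [5, 8, 12, 15, 20, 25, 30]
cites = [120, 300, 700, 650, 2200, 1800, 2600]
z = ols_residuals(age, [math.log(c) for c in cites])
for a, c, zi, p in zip(age, cites, z, percentile_ranks(z)):
    print(f"age={a:2d} cites={c:5d} resid={zi:+.2f} pct={p:5.1f}")
```

A positive standardized residual marks a faculty member whose citations exceed the covariate-adjusted expectation for their academic age; in the paper, such residuals for 8 metrics are combined (simple average or factor-analysis weights) and aggregated to compare universities.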

Suggested Citation

  • Robert K Swihart & Mekala Sundaram & Tomas O Höök & J Andrew DeWoody & Kenneth F Kellner, 2016. "Performance Benchmarks for Scholarly Metrics Associated with Fisheries and Wildlife Faculty," PLOS ONE, Public Library of Science, vol. 11(5), pages 1-16, May.
  • Handle: RePEc:plo:pone00:0155097
    DOI: 10.1371/journal.pone.0155097

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0155097
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0155097&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0155097?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to the full text via your library subscription

    References listed on IDEAS

    1. Igor Podlubny, 2005. "Comparison of scientific impact expressed by the number of citations in different fields of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(1), pages 95-99, July.
    2. A. M. Petersen & O. Penner & H. E. Stanley, 2011. "Methods for detrending success metrics to account for inflationary and deflationary factors," The European Physical Journal B: Condensed Matter and Complex Systems, Springer;EDP Sciences, vol. 79(1), pages 67-78, January.
    3. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    4. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel & Gerlind Wallon & Anna Ledin, 2009. "Are there really two types of h index variants? A validation study by using molecular life sciences data," Research Evaluation, Oxford University Press, vol. 18(3), pages 185-190, September.
    5. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Is the h index related to (standard) bibliometric measures and to the assessments by peers? An investigation of the h index by using molecular life sciences data," Research Evaluation, Oxford University Press, vol. 17(2), pages 149-156, June.
    6. Amin Mazloumian, 2012. "Predicting Scholars' Scientific Impact," PLOS ONE, Public Library of Science, vol. 7(11), pages 1-5, November.
    7. Salih Selek & Ayman Saleh, 2014. "Use of h index and g index for American academic psychiatry," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 541-548, May.
    8. C. O. S. Sorzano & J. Vargas & G. Caffarena-Fernández & A. Iriarte, 2014. "Comparing scientific performance among equals," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(3), pages 1731-1745, December.
    9. Johannes Hönekopp & Julie Khan, 2012. "Future publication success in science is better predicted by traditional measures than by the h index," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 843-853, March.
    10. Nees Jan van Eck & Ludo Waltman & Anthony F J van Raan & Robert J M Klautz & Wilco C Peul, 2013. "Citation Analysis May Severely Underestimate the Impact of Clinical Research as Compared to Basic Research," PLOS ONE, Public Library of Science, vol. 8(4), pages 1-6, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    2. Eduardo A. Oliveira & Enrico A. Colosimo & Daniella R. Martelli & Isabel G. Quirino & Maria Christina L. Oliveira & Leonardo S. Lima & Ana Cristina Simões e Silva & Hercílio Martelli-Júnior, 2012. "Comparison of Brazilian researchers in clinical medicine: are criteria for ranking well-adjusted?," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 429-443, February.
    3. Asma Hammami & Nabil Semmar, 2022. "The simplex simulation as a tool to reveal publication strategies and citation factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 319-350, January.
    4. Li Hou & Qiang Wu & Yundong Xie, 2022. "Does early publishing in top journals really predict long-term scientific success in the business field?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6083-6107, November.
    5. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    6. Vîiu, Gabriel-Alexandru, 2016. "A theoretical evaluation of Hirsch-type bibliometric indicators confronted with extreme self-citation," Journal of Informetrics, Elsevier, vol. 10(2), pages 552-566.
    7. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    8. Hyeonchae Yang & Woo-Sung Jung, 2015. "A strategic management approach for Korean public research institutes based on bibliometric investigation," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(4), pages 1437-1464, July.
    9. Justus Haucap & Johannes Muck, 2015. "What drives the relevance and reputation of economics journals? An update from a survey among economists," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 849-877, June.
    10. Zheng Yan & Wenqian Robertson & Yaosheng Lou & Tom W. Robertson & Sung Yong Park, 2021. "Finding leading scholars in mobile phone behavior: a mixed-method analysis of an emerging interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9499-9517, December.
    11. Zhenbin Yan & Qiang Wu & Xingchen Li, 2016. "Do Hirsch-type indices behave the same in assessing single publications? An empirical study of 29 bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1815-1833, December.
    12. Jiri Vanecek, 2008. "Bibliometric analysis of the Czech research publications from 1994 to 2005," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(2), pages 345-360, November.
    13. Lathabai, Hiran H., 2020. "ψ-index: A new overall productivity index for actors of science and technology," Journal of Informetrics, Elsevier, vol. 14(4).
    14. Maziar Montazerian & Edgar Dutra Zanotto & Hellmut Eckert, 2019. "A new parameter for (normalized) evaluation of H-index: countries as a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1065-1078, March.
    15. Marcin Kozak & Lutz Bornmann, 2012. "A New Family of Cumulative Indexes for Measuring Scientific Performance," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-4, October.
    16. Bárbara S. Lancho-Barrantes & Vicente P. Guerrero-Bote & Félix Moya-Anegón, 2010. "The iceberg hypothesis revisited," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 443-461, November.
    17. David I Stern, 2014. "High-Ranked Social Science Journal Articles Can Be Identified from Early Citation Information," PLOS ONE, Public Library of Science, vol. 9(11), pages 1-11, November.
    18. Jing Li & Qiushuang Long & Xiaoli Lu & Dengsheng Wu, 2023. "Citation beneficiaries of discipline-specific mega-journals: who and how much," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-10, December.
    19. Georgios Stoupas & Antonis Sidiropoulos & Antonia Gogoglou & Dimitrios Katsaros & Yannis Manolopoulos, 2018. "Rainbow ranking: an adaptable, multidimensional ranking method for publication sets," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 147-160, July.
    20. Daniella B Deutz & Evgenios Vlachos & Dorte Drongstrup & Bertil F Dorch & Charlotte Wien, 2020. "Effective publication strategies in clinical research," PLOS ONE, Public Library of Science, vol. 15(1), pages 1-12, January.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.