
An ablation study on the use of publication venue quality to rank computer science departments

Author

Listed:
  • Aniruddha Maiti (Temple University)
  • Sai Shi (Temple University)
  • Slobodan Vucetic (Temple University)

Abstract

This paper focuses on ranking computer science departments based on the quality of publications by the faculty in those departments. There are multiple strategies for converting publication lists into ranking scores for departments. Important open questions include how to handle multi-author publications, which publications and publication venues to include, how to account for the quality of publication venues, and how to account for the sub-areas of computer science. An ablation study is performed to evaluate the importance of these decisions for department ranking, where importance is measured by the correlation between the resulting rankings and the peer assessment of computer science departments provided by the U.S. News. The results show that the selection of publication venues has the highest impact on the ranking, whereas decisions related to publication recency, multi-author publications, and clustering publications into sub-areas have less impact. Overall, Pearson's correlation coefficient between the publication-based scores and the U.S. News ranking is above 0.90 for a large range of decisions, indicating strong agreement between the objective measure and the subjective opinion of peers.
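The agreement measure named in the abstract is Pearson's correlation coefficient between publication-based department scores and the U.S. News peer-assessment scores. A minimal sketch of that computation follows; the five departments' scores are hypothetical illustration data, not values from the paper.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance term and the two standard-deviation terms (unnormalized,
    # since the 1/n factors cancel in the ratio).
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical publication-based scores and U.S. News peer-assessment
# scores (1-5 scale) for five departments -- illustration only.
pub_scores = [92.0, 85.5, 78.0, 66.5, 54.0]
peer_scores = [4.8, 4.5, 4.1, 3.6, 3.2]

print(round(pearson_r(pub_scores, peer_scores), 3))
```

In practice the same quantity is available as `scipy.stats.pearsonr`; the point of the sketch is only to make the abstract's "above 0.90" criterion concrete: any monotone, roughly linear agreement between the two score lists yields a coefficient near 1.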

Suggested Citation

  • Aniruddha Maiti & Sai Shi & Slobodan Vucetic, 2023. "An ablation study on the use of publication venue quality to rank computer science departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4197-4218, August.
  • Handle: RePEc:spr:scient:v:128:y:2023:i:8:d:10.1007_s11192-023-04733-2
    DOI: 10.1007/s11192-023-04733-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04733-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04733-2?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Shahryar Rahnamayan & Sedigheh Mahdavi & Kalyanmoy Deb & Azam Asilian Bidgoli, 2020. "Ranking Multi-Metric Scientific Achievements Using a Concept of Pareto Optimality," Mathematics, MDPI, vol. 8(6), pages 1-46, June.
    2. Nikolaos A. Kazakis, 2014. "Bibliometric evaluation of the research performance of the Greek civil engineering departments in National and European context," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 505-525, October.
    3. Chao Lu & Ying Ding & Chengzhi Zhang, 2017. "Understanding the impact change of a highly cited article: a content-based citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 927-945, August.
    4. Stelios Katranidis & Theodore Panagiotidis & Costas Zontanos, 2014. "An Evaluation Of The Greek Universities’ Economics Departments," Bulletin of Economic Research, Wiley Blackwell, vol. 66(2), pages 173-182, April.
    5. Xie, Qing & Zhang, Xinyuan & Song, Min, 2021. "A network embedding-based scholar assessment indicator considering four facets: Research topic, author credit allocation, field-normalized journal impact, and published time," Journal of Informetrics, Elsevier, vol. 15(4).
    6. Hu, Beibei & Ding, Yang & Dong, Xianlei & Bu, Yi & Ding, Ying, 2021. "On the relationship between download and citation counts: An introduction of Granger-causality inference," Journal of Informetrics, Elsevier, vol. 15(2).
    7. Kuan, Chung-Huei & Huang, Mu-Hsuan & Chen, Dar-Zen, 2013. "Cross-field evaluation of publications of research institutes using their contributions to the fields’ MVPs determined by h-index," Journal of Informetrics, Elsevier, vol. 7(2), pages 455-468.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    9. Nikolaos A. Kazakis & Anastasios D. Diamantidis & Leonidas L. Fragidis & Miltos K. Lazarides, 2014. "Evaluating the research performance of the Greek medical schools using bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1367-1384, February.
    10. Cao, Huiying & Gao, Chao & Wang, Zhen, 2023. "Ranking academic institutions by means of institution–publication networks," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 629(C).
    11. Petr Praus, 2019. "High-ranked citations percentage as an indicator of publications quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 319-329, July.
    12. Nikolaos A. Kazakis, 2015. "The research activity of the current faculty of the Greek chemical engineering departments: a bibliometric study in national and international context," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(1), pages 229-250, April.
    13. Giovanni Abramo & Ciriaco Andrea D’Angelo & Fulvio Viel, 2013. "The suitability of h and g indexes for measuring the research performance of institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 555-570, December.
    14. Vîiu, Gabriel-Alexandru, 2016. "A theoretical evaluation of Hirsch-type bibliometric indicators confronted with extreme self-citation," Journal of Informetrics, Elsevier, vol. 10(2), pages 552-566.
    15. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    16. Aurelia Magdalena Pisoschi & Claudia Gabriela Pisoschi, 2016. "Is open access the solution to increase the impact of scientific journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1075-1095, November.
    17. Gaviria-Marin, Magaly & Merigó, José M. & Baier-Fuentes, Hugo, 2019. "Knowledge management: A global examination based on bibliometric analysis," Technological Forecasting and Social Change, Elsevier, vol. 140(C), pages 194-220.
    18. Kaur, Jasleen & Radicchi, Filippo & Menczer, Filippo, 2013. "Universality of scholarly impact metrics," Journal of Informetrics, Elsevier, vol. 7(4), pages 924-932.
    19. Vinayak & Raghuvanshi, Adarsh & Kshitij, Avinash, 2023. "Signatures of capacity development through research collaborations in artificial intelligence and machine learning," Journal of Informetrics, Elsevier, vol. 17(1).
    20. Masaru Kuno & Mary Prorok & Shubin Zhang & Huy Huynh & Thurston Miller, 2022. "Deciphering the US News and World Report Ranking of US Chemistry Graduate Programs," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2131-2150, May.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.