
On evaluating the quality of a computer science/computer engineering conference

Author

Listed:
  • Loizides, Orestis-Stavros
  • Koutsakis, Polychronis

Abstract

The Peer Reputation (PR) metric was recently proposed in the literature to judge a researcher’s contribution through the quality of the venue in which the researcher’s work is published. PR, proposed by Nelakuditi et al., ties the selectivity of a publication venue to the reputation of the first author’s institution. By computing PR for a percentage of the papers accepted at a conference or in a journal, one can derive a more solid indicator of a venue’s selectivity than the paper Acceptance Ratio (AR). In recent work we explained why we agree that PR offers substantial information that is missing from AR; however, we also pointed out several limitations of the metric, which make PR inadequate, on its own, for a solid evaluation of a researcher’s contribution. In this work, we present our own approach for judging the quality of a Computer Science/Computer Engineering conference venue and thus, implicitly, the potential quality of a paper accepted at that conference. Driven by our previous findings on the adequacy of PR, as well as our belief that an institution does not necessarily “make” a researcher, we propose a Conference Classification Approach (CCA) that takes into account a number of metrics and factors in addition to PR, namely the paper’s impact and the authors’ h-indexes. We present and discuss our results, based on data gathered from close to 3000 papers from 12 top-tier Computer Science/Computer Engineering conferences belonging to different research fields. To evaluate CCA, we compare our conference rankings against multiple publicly available rankings based on evaluations from the Computer Science/Computer Engineering community, and we show that our approach produces closely comparable classifications.
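The abstract names CCA's ingredients (PR, the paper's impact, the authors' h-indexes) but this page does not reproduce the paper's formula. The Python sketch below is therefore illustrative only: h_index follows the standard definition (the largest h such that at least h of an author's papers have h or more citations each), while conference_score, its equal weights, and the plain weighted sum are hypothetical stand-ins for the aggregation actually used by Loizides & Koutsakis.

    from dataclasses import dataclass
    from typing import List

    def h_index(citations: List[int]) -> int:
        """Standard h-index: the largest h such that at least h papers
        have h or more citations each."""
        h = 0
        for i, c in enumerate(sorted(citations, reverse=True), start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    @dataclass
    class Paper:
        citations: int               # citations received ("paper impact")
        author_h_indexes: List[int]  # h-index of each author

    def conference_score(papers: List[Paper], peer_reputation: float,
                         w_pr: float = 1/3, w_impact: float = 1/3,
                         w_h: float = 1/3) -> float:
        """Hypothetical composite score: a weighted mix of the venue's
        Peer Reputation, mean paper impact, and mean author h-index.
        The weights and the absence of scale normalization are
        assumptions, not the formula from the paper."""
        mean_impact = sum(p.citations for p in papers) / len(papers)
        mean_h = sum(sum(p.author_h_indexes) / len(p.author_h_indexes)
                     for p in papers) / len(papers)
        return w_pr * peer_reputation + w_impact * mean_impact + w_h * mean_h

    # Toy usage with made-up numbers:
    papers = [Paper(citations=120, author_h_indexes=[25, 12]),
              Paper(citations=40, author_h_indexes=[8])]
    print(conference_score(papers, peer_reputation=0.8))

Since PR, citation counts, and h-indexes live on different scales, any real combination would normalize them first; the full text should be consulted for the procedure the authors actually use.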

Suggested Citation

  • Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:2:p:541-552
    DOI: 10.1016/j.joi.2017.03.008

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157716301808
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2017.03.008?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.
    2. Lokman I. Meho & Kiduk Yang, 2007. "Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 2105-2125, November.
    3. Vinkler, Péter, 2012. "The case of scientometricians with the “absolute relative” impact indicator," Journal of Informetrics, Elsevier, vol. 6(2), pages 254-264.
    4. James Hartley, 2012. "To cite or not to cite: author self-citations and the impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 313-317, August.
    5. Wolfgang Glänzel & András Schubert, 2003. "A new classification scheme of science fields and subfields designed for scientometric evaluation purposes," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(3), pages 357-367, March.
    6. Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
    7. Anne-Wil Harzing, 2014. "A longitudinal study of Google Scholar coverage between 2012 and 2013," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 565-575, January.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    2. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    3. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    3. Muhammad Salman & Mohammad Masroor Ahmed & Muhammad Tanvir Afzal, 2021. "Assessment of author ranking indices based on multi-authorship," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4153-4172, May.
    4. Parul Khurana & Kiran Sharma, 2022. "Impact of h-index on author’s rankings: an improvement to the h-index for lower-ranked authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4483-4498, August.
    5. Alireza Abbasi & Mahdi Jalili & Abolghasem Sadeghi-Niaraki, 2018. "Influence of network-based structural and power diversity on research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 579-590, October.
    6. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    7. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    8. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), February.
    9. Alona Zharova & Wolfgang K. Härdle & Stefan Lessmann, 2017. "Is Scientific Performance a Function of Funds?," SFB 649 Discussion Papers SFB649DP2017-028, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    10. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.
    11. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    12. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    13. Loet Leydesdorff, 2013. "An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 35-55, January.
    14. Enrique Orduna-Malea & Juan M. Ayllón & Alberto Martín-Martín & Emilio Delgado López-Cózar, 2015. "Methods for estimating the size of Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 931-949, September.
    15. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    16. Cesar H. Limaymanta & Rosalía Quiroz-de-García & Jesús A. Rivas-Villena & Andrea Rojas-Arroyo & Orlando Gregorio-Chaviano, 2022. "Relationship between collaboration and normalized scientific impact in South American public universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6391-6411, November.
    17. Lorna Wildgaard, 2015. "A comparison of 17 author-level bibliometric indicators for researchers in Astronomy, Environmental Science, Philosophy and Public Health in Web of Science and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 873-906, September.
    18. Moed, Henk F. & Bar-Ilan, Judit & Halevi, Gali, 2016. "A new methodology for comparing Google Scholar and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 533-551.
    19. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    20. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:11:y:2017:i:2:p:541-552. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.