
Ranking computer science conferences using self-organizing maps with dynamic node splitting

Author

Listed:
  • Vinicius da Silva Almendra

    (University of Bucharest)

  • Denis Enăchescu

    (University of Bucharest
    “Gheorghe Mihoc - Caius Iacob” Institute for Mathematical Statistics and Applied Mathematics)

  • Cornelia Enăchescu

    (“Gheorghe Mihoc - Caius Iacob” Institute for Mathematical Statistics and Applied Mathematics)

Abstract

Research dissemination in the Computer Science domain depends heavily on conference publications. The review processes of major conferences are rigorous, and work presented in those venues gains more visibility and more citations than many journals, with the added advantage of faster dissemination of ideas. We consider that any evaluation system in the Computer Science domain must treat conferences as having the same importance as journals, which makes the evaluation of venues an important issue. While journals are usually evaluated through their Impact Factor, there is no widely accepted method for evaluating conferences. In our work we analyzed the possibility of using machine learning techniques to extend an existing ranking to new conferences, based on a set of measurements that are available for the majority of venues. Our proposal consists of applying a machine learning technique, self-organizing maps, with some extensions, in order to classify new conferences based on an existing ranking. We also try to estimate the theoretical maximum accuracy that can be obtained using statistical learning techniques.
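To make the general approach concrete, the following is a minimal Python sketch (not the authors' implementation) of how a self-organizing map trained on already-ranked conferences can be used to classify new ones by the rank of their best-matching map node. The feature set, rank labels, grid size and helper names (scale, train_som, label_nodes, classify) are illustrative assumptions, and the paper's dynamic node splitting extension is not reproduced here.

# Minimal sketch, assuming illustrative conference measurements and rank labels.
import numpy as np

def scale(X):
    """Min-max scale each feature column to [0, 1] so no measurement dominates."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / np.where(hi > lo, hi - lo, 1), lo, hi

def train_som(data, grid=(6, 6), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a rectangular SOM; returns node weights of shape (rows*cols, dim)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighbourhood radius
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))                    # neighbourhood kernel
            weights += lr * h[:, None] * (x - weights)
    return weights

def label_nodes(weights, data, labels):
    """Assign each node the majority rank of the training conferences it wins."""
    node_labels = {}
    bmus = np.argmin(np.linalg.norm(data[:, None] - weights[None], axis=2), axis=1)
    for node in range(len(weights)):
        owned = labels[bmus == node]
        if len(owned):
            vals, counts = np.unique(owned, return_counts=True)
            node_labels[node] = vals[np.argmax(counts)]
    return node_labels

def classify(weights, node_labels, x):
    """Rank a new conference by the label of its nearest labeled node."""
    for node in np.argsort(np.linalg.norm(weights - x, axis=1)):
        if node in node_labels:
            return node_labels[node]

# Hypothetical usage: rows are conferences, columns are assumed measurements
# such as citations per paper, acceptance rate and program-committee size.
X_train = np.array([[4.1, 0.18, 45.0], [0.6, 0.55, 12.0], [2.3, 0.30, 30.0]])
y_train = np.array(["A", "C", "B"])      # existing ranking used as training labels
X_scaled, lo, hi = scale(X_train)
som = train_som(X_scaled)
node_ranks = label_nodes(som, X_scaled, y_train)
new = (np.array([3.5, 0.22, 40.0]) - lo) / np.where(hi > lo, hi - lo, 1)
print(classify(som, node_ranks, new))    # predicted rank for the new conference

In this sketch the existing ranking supplies the training labels and the map nodes act as prototypes; the dynamic node splitting described in the paper would refine crowded nodes rather than keep the grid fixed as done here.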

Suggested Citation

  • Vinicius da Silva Almendra & Denis Enăchescu & Cornelia Enăchescu, 2015. "Ranking computer science conferences using self-organizing maps with dynamic node splitting," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 267-283, January.
  • Handle: RePEc:spr:scient:v:102:y:2015:i:1:d:10.1007_s11192-014-1436-y
    DOI: 10.1007/s11192-014-1436-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-014-1436-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-014-1436-y?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Waister Silva Martins & Marcos André Gonçalves & Alberto H. F. Laender & Nivio Ziviani, 2010. "Assessing the quality of scientific conferences based on bibliographic citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(1), pages 133-155, April.
    2. Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
    3. Linda Butler, 2008. "ICT assessment: Moving beyond journal outputs," Scientometrics, Springer;Akadémiai Kiadó, vol. 74(1), pages 39-55, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.
    2. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    3. Željko Stević & Irena Đalić & Dragan Pamučar & Zdravko Nunić & Slavko Vesković & Marko Vasiljević & Ilija Tanackov, 2019. "A new hybrid model for quality assessment of scientific conferences based on Rough BWM and SERVQUAL," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 1-30, April.
    4. D. R. Amancio & M. G. V. Nunes & O. N. Oliveira & L. F. Costa, 2012. "Using complex networks concepts to assess approaches for citations in scientific papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 827-842, June.
    5. Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
    6. Peter Ingwersen & Birger Larsen & J. Carlos Garcia-Zorita & Antonio Eleazar Serrano-López & Elias Sanz-Casado, 2014. "Influence of proceedings papers on citation impact in seven sub-fields of sustainable energy research 2005–2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1273-1292, November.
    7. Nicholas Yee Liang Hing & Xin Ci Wong & Pei Xuan Kuan & Mohan Dass Pathmanathan & Mohd Aizuddin Abdul Rahman & Kalaiarasu M. Peariasamy, 2022. "Scientific Abstract to Full Paper: Publication Rate over a 3-Year Period in a Malaysian Clinical Research Conference," Publications, MDPI, vol. 10(4), pages 1-13, October.
8. Peder Olesen Larsen & Markus von Ins, 2010. "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 575-603, September.
    9. Michael Eckmann & Anderson Rocha & Jacques Wainer, 2012. "Relationship between high-quality journals and conferences in computer vision," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 617-630, February.
    10. Cosma, Simona & Rimo, Giuseppe, 2024. "Redefining insurance through technology: Achievements and perspectives in Insurtech," Research in International Business and Finance, Elsevier, vol. 70(PA).
    11. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    12. Matthew Harsh & Ravtosh Bal & Alex Weryha & Justin Whatley & Charles C. Onu & Lisa M. Negro, 2021. "Mapping computer science research in Africa: using academic networking sites for assessing research activity," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 305-334, January.
    13. Peter Ingwersen & Birger Larsen, 2014. "Influence of a performance indicator on Danish research production and citation impact 2000–12," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1325-1344, November.
    14. Stefano Breschi & Franco Malerba, 2011. "Assessing the scientific and technological output of EU Framework Programmes: evidence from the FP6 projects in the ICT field," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(1), pages 239-257, July.
    15. Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
    16. Regina Negri Pagani & Bruno Pedroso & Celso Bilynkievycz Santos & Claudia Tania Picinin & João Luiz Kovaleski, 2023. "Methodi Ordinatio 2.0: revisited under statistical estimation, and presenting FInder and RankIn," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(5), pages 4563-4602, October.
    17. Polat, Zeynel Abidin & Alkan, Mehmet & Paulsson, Jenny & Paasch, Jesper M. & Kalogianni, Eftychia, 2022. "Global scientific production on LADM-based research: A bibliometric analysis from 2012 to 2020," Land Use Policy, Elsevier, vol. 112(C).
    18. Zhang, Lin & Glänzel, Wolfgang, 2012. "Proceeding papers in journals versus the “regular” journal publications," Journal of Informetrics, Elsevier, vol. 6(1), pages 88-96.
    19. Sicilia, Miguel-Angel & Sánchez-Alonso, Salvador & García-Barriocanal, Elena, 2011. "Comparing impact factors from two different citation databases: The case of Computer Science," Journal of Informetrics, Elsevier, vol. 5(4), pages 698-704.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:102:y:2015:i:1:d:10.1007_s11192-014-1436-y. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.