
Quantity versus impact of software engineering papers: a quantitative study

Author

Listed:
  • Vahid Garousi

    (University of Luxembourg
    Hacettepe University)

  • João M. Fernandes

    (University of Minho)

Abstract

According to data from the Scopus publication database, as analyzed in several recent studies, more than 70,000 papers have been published in the area of Software Engineering (SE) since the late 1960s. According to our recent work, 43% of those papers have received no citations at all. Since citations are the most commonly used metric for measuring research (academic) impact, these figures raise doubts about the impact of such a large set of papers. Typical academic reward systems encourage researchers to publish more papers and do not place a major emphasis on research impact. To shed light on the issue of volume (quantity) versus citation-based impact of SE research papers, we report in this paper a quantitative bibliometric assessment covering four aspects: (1) quantity versus impact of different paper types (e.g., conference versus journal papers), (2) ratios of uncited (non-impactful) papers, (3) quantity versus impact of papers originating from different countries, and (4) quantity versus impact of papers by each of the top-10 authors (in terms of number of papers). To achieve this objective, we conducted a quantitative exploratory bibliometric assessment, comprising four research questions, on the aspects discussed above. We extracted the data through a systematic, automated and repeatable process from the Scopus paper database, which we also used in two previous papers. Our results show that the distribution of SE publications exhibits a major inequality in terms of impact, both overall and when categorized by the four aspects above; in this respect, the SE literature resembles other areas of science examined in previous bibliometric studies. Among our results is the finding that journal articles and conference papers have been cited 12.6 and 3.6 times on average, respectively, confirming the expectation that journal articles generally have more impact than conference papers. Also, papers originating from English-speaking countries have, in general, more visibility and impact (and consequently more citations) than papers originating from non-English-speaking countries. Our results have implications for improving academic reward systems, which nowadays mainly encourage researchers to publish more papers and usually neglect research impact. They can also help researchers in non-English-speaking countries consider improvements to increase the impact of their upcoming papers.
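As a rough illustration of the kind of analysis the abstract describes (share of uncited papers, mean citations by document type), the sketch below processes a Scopus CSV export. It is a minimal sketch, not the authors' actual pipeline; the file name and the column names "Cited by" and "Document Type" are assumptions about a typical Scopus export.

```python
# Hypothetical sketch of a citation-impact summary over a Scopus CSV export.
# Assumed (not confirmed by the paper): columns "Cited by" and "Document Type".
import pandas as pd


def summarize(csv_path: str) -> None:
    df = pd.read_csv(csv_path)
    cites = df["Cited by"].fillna(0).astype(int)  # uncited papers export as blank

    # Share of papers with zero citations (cf. the 43% figure reported for SE).
    uncited_share = (cites == 0).mean()
    print(f"Uncited papers: {uncited_share:.1%}")

    # Paper count and mean citations per document type (e.g., Article vs. Conference Paper).
    by_type = df.assign(cites=cites).groupby("Document Type")["cites"].agg(["count", "mean"])
    print(by_type.sort_values("mean", ascending=False))


if __name__ == "__main__":
    summarize("scopus_se_papers.csv")  # hypothetical export file name
```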

Suggested Citation

  • Vahid Garousi & João M. Fernandes, 2017. "Quantity versus impact of software engineering papers: a quantitative study," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 963-1006, August.
  • Handle: RePEc:spr:scient:v:112:y:2017:i:2:d:10.1007_s11192-017-2419-6
    DOI: 10.1007/s11192-017-2419-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-017-2419-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-017-2419-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Daniel Sarewitz, 2016. "The pressure to publish pushes down quality," Nature, Nature, vol. 533(7602), pages 147-147, May.
    2. Peter A. Lawrence, 2003. "The politics of publication," Nature, Nature, vol. 422(6929), pages 259-261, March.
    3. Lutz Bornmann, 2014. "How are excellent (highly cited) papers defined in bibliometrics? A quantitative analysis of the literature," Research Evaluation, Oxford University Press, vol. 23(2), pages 166-173.
    4. John Mingers & Evangelia A. E. C. G. Lipitakis, 2010. "Counting the citations: a comparison of Web of Science and Google Scholar in the field of business and management," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 613-625, November.
    5. Olle Persson, 2010. "Are highly cited papers more international?," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 397-401, May.
    6. Brett Buttliere & Jürgen Buder, 2017. "Personalizing papers using Altmetrics: comparing paper ‘Quality’ or ‘Impact’ to person ‘Intelligence’ or ‘Personality’," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 219-239, April.
    7. Richard Van Noorden & Brendan Maher & Regina Nuzzo, 2014. "The top 100 papers," Nature, Nature, vol. 514(7524), pages 550-553, October.
    8. Rickard Danell, 2011. "Can the quality of scientific work be predicted using information on the author's track record?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(1), pages 50-60, January.
    9. A. Abrizah & A. N. Zainab & K. Kiran & R. G. Raj, 2013. "LIS journals scientific impact and subject categorization: a comparison between Web of Science and Scopus," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(2), pages 721-740, February.
    10. Heather Piwowar, 2013. "Value all research products," Nature, Nature, vol. 493(7431), pages 159-159, January.
    11. Éric Archambault & David Campbell & Yves Gingras & Vincent Larivière, 2009. "Comparing bibliometric statistics obtained from the Web of Science and Scopus," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(7), pages 1320-1326, July.
    12. George Vrettas & Mark Sanderson, 2015. "Conferences versus journals in computer science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2674-2684, December.
    13. João M. Fernandes, 2014. "Authorship trends in software engineering," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 257-271, October.
    14. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    15. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2014. "Are the authors of highly cited articles also the most productive ones?," Journal of Informetrics, Elsevier, vol. 8(1), pages 89-97.
    16. Aghaei Chadegani, Arezoo & Salehi, Hadi & Md Yunus, Melor & Farhadi, Hadi & Fooladi, Masood & Farhadi, Maryam & Ale Ebrahim, Nader, 2013. "A Comparison between Two Main Academic Literature Collections: Web of Science and Scopus Databases," MPRA Paper 46898, University Library of Munich, Germany, revised 18 Mar 2013.
    17. Thomas A. Hamrick & Ronald D. Fricker & Gerald G. Brown, 2010. "Assessing What Distinguishes Highly Cited from Less-Cited Papers Published in Interfaces," Interfaces, INFORMS, vol. 40(6), pages 454-464, December.
    18. Vahid Garousi, 2015. "A bibliometric analysis of the Turkish software engineering research community," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(1), pages 23-49, October.
    19. Gianmarco Paris & Giulio De Leo & Paolo Menozzi & Marino Gatto, 1998. "Region-based citation bias in science," Nature, Nature, vol. 396(6708), pages 210-210, November.
    20. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    21. Dag W Aksnes, 2003. "Characteristics of highly cited papers," Research Evaluation, Oxford University Press, vol. 12(3), pages 159-170, December.
    22. Rickard Danell, 2011. "Can the quality of scientific work be predicted using information on the author's track record?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(1), pages 50-60, January.
    23. David A. King, 2004. "The scientific impact of nations," Nature, Nature, vol. 430(6997), pages 311-316, July.
    24. Mingyang Wang & Guang Yu & Daren Yu, 2011. "Mining typical features for highly cited papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 695-706, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sergey Kolesnikov & Eriko Fukumoto & Barry Bozeman, 2018. "Researchers’ risk-smoothing publication strategies: Is productivity the enemy of impact?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 1995-2017, September.
    2. Kai Petersen & Nauman Bin Ali, 2021. "An analysis of top author citations in software engineering and a comparison with other fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 9147-9183, November.
    3. Carolina Navarro-Lopez & Salvador Linares-Mustaros & Carles Mulet-Forteza, 2022. "“The Statistical Analysis of Compositional Data” by John Aitchison (1986): A Bibliometric Overview," SAGE Open, vol. 12(2), pages 21582440221, April.
    4. Jefferson Seide Molléri & Kai Petersen & Emilia Mendes, 2018. "Towards understanding the relation between citations and research quality in software engineering studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1453-1478, December.
    5. Isabel Basson & Jaco P. Blanckenberg & Heidi Prozesky, 2021. "Do open access journal articles experience a citation advantage? Results and methodological reflections of an application of multiple measures to an analysis by WoS subject areas," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 459-484, January.
    6. Andrey E. Guskov & Denis V. Kosyakov & Irina V. Selivanova, 2018. "Boosting research productivity in top Russian universities: the circumstances of breakthrough," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(2), pages 1053-1080, November.
    7. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Tamara Krajna & Jelka Petrak, 2019. "Croatian Highly Cited Papers," Interdisciplinary Description of Complex Systems, Croatian Interdisciplinary Society (http://indecs.eu), vol. 17(3-B), pages 684-696.
    3. Nobuko Miyairi & Han-Wen Chang, 2012. "Bibliometric characteristics of highly cited papers from Taiwan, 2000–2009," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(1), pages 197-205, July.
    4. Tian Yu & Guang Yu & Peng-Yu Li & Liang Wang, 2014. "Citation impact prediction for scientific papers using stepwise regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1233-1252, November.
    5. Jonas Lindahl & Cristian Colliander & Rickard Danell, 2020. "Early career performance and its correlation with gender and publication output during doctoral education," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 309-330, January.
    6. Liang, Liming & Zhong, Zhen & Rousseau, Ronald, 2015. "Uncited papers, uncited authors and uncited topics: A case study in library and information science," Journal of Informetrics, Elsevier, vol. 9(1), pages 50-58.
    7. Jefferson Seide Molléri & Kai Petersen & Emilia Mendes, 2018. "Towards understanding the relation between citations and research quality in software engineering studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1453-1478, December.
    8. Mingyang Wang & Guang Yu & Shuang An & Daren Yu, 2012. "Discovery of factors influencing citation impact based on a soft fuzzy rough set model," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 635-644, December.
    9. Lindahl, Jonas, 2018. "Predicting research excellence at the individual level: The importance of publication rate, top journal publications, and top 10% publications in the case of early career mathematicians," Journal of Informetrics, Elsevier, vol. 12(2), pages 518-533.
    10. Wang, Mingyang & Yu, Guang & Xu, Jianzhong & He, Huixin & Yu, Daren & An, Shuang, 2012. "Development a case-based classifier for predicting highly cited papers," Journal of Informetrics, Elsevier, vol. 6(4), pages 586-599.
    11. Feiheng Luo & Aixin Sun & Mojisola Erdt & Aravind Sesagiri Raamkumar & Yin-Leng Theng, 2018. "Exploring prestigious citations sourced from top universities in bibliometrics and altmetrics: a case study in the computer science discipline," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 1-17, January.
    12. Li, Xin & Wen, Yang & Jiang, Jiaojiao & Daim, Tugrul & Huang, Lucheng, 2022. "Identifying potential breakthrough research: A machine learning method using scientific papers and Twitter data," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    13. Vivek Kumar Singh & Prashasti Singh & Mousumi Karmakar & Jacqueline Leta & Philipp Mayr, 2021. "The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5113-5142, June.
    14. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    15. Alba Santa Soriano & Carolina Lorenzo Álvarez & Rosa María Torres Valdés, 2018. "Bibliometric analysis to identify an emerging research area: Public Relations Intelligence—a challenge to strengthen technological observatories in the network society," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1591-1614, June.
    16. Ping Zhou & Yongfeng Zhong & Meigen Yu, 2013. "A bibliometric investigation on China–UK collaboration in food and agriculture," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 267-285, November.
    17. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    18. Wanjun Xia & Tianrui Li & Chongshou Li, 2023. "A review of scientific impact prediction: tasks, features and methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 543-585, January.
    19. Cathaysa Martín-Blanco & Montserrat Zamorano & Carmen Lizárraga & Valentin Molina-Moreno, 2022. "The Impact of COVID-19 on the Sustainable Development Goals: Achievements and Expectations," IJERPH, MDPI, vol. 19(23), pages 1-25, December.
    20. Houcemeddine Turki & Mohamed Ali Hadj Taieb & Mohamed Ben Aouicha & Ajith Abraham, 2020. "Nature or Science: what Google Trends says," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1367-1385, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:112:y:2017:i:2:d:10.1007_s11192-017-2419-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.