Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data

Author

Listed:
  • Ad A.M. Prins
  • Rodrigo Costas
  • Thed N. van Leeuwen
  • Paul F. Wouters

Abstract

In this paper, we report on the application of Google Scholar (GS)-based metrics in the formal assessment of research programs in the fields of Education, Pedagogical Sciences, and Anthropology in The Netherlands, and we compare the results with those based on Web of Science (WoS) data. Studies critical of GS point to the questionable reliability of its data. We show how the reliability of the GS data used in the bibliometric analysis for the assessment can be improved by excluding non-verifiable citing sources from the full set of second-order GS citing data. A study of the background of these second-order sources demonstrates a broadening of the citing sources. The comparison of GS with WoS citations for the publications of the programs shows that GS is a promising source for fields with lower coverage in WoS, in particular fields that produce more diverse types of output than research articles alone. The main restriction on the use of GS is the intensive manual data handling and cleaning required for feasible and proper data collection. We discuss the wider implications of these findings for bibliometric analysis and for the practices and policies of research evaluation.
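
The abstract describes a data-cleaning and comparison step: second-order GS citing sources that cannot be verified are excluded before GS citation counts are compared with WoS counts. The following minimal sketch in Python illustrates that kind of filtering and comparison; the record fields (cited_pub_id, source_verified) and the verification rule are hypothetical placeholders, not the authors' actual data format or procedure.

from collections import defaultdict

def filter_verifiable(citing_records):
    # Keep only GS citing records whose citing source could be verified.
    return [r for r in citing_records if r.get("source_verified", False)]

def citation_counts(citing_records, key="cited_pub_id"):
    # Count citations per cited publication.
    counts = defaultdict(int)
    for r in citing_records:
        counts[r[key]] += 1
    return counts

def compare_gs_wos(gs_records, wos_counts):
    # Compare cleaned GS citation counts with WoS counts, per publication.
    gs_counts = citation_counts(filter_verifiable(gs_records))
    pubs = set(gs_counts) | set(wos_counts)
    return {pub: (gs_counts.get(pub, 0), wos_counts.get(pub, 0)) for pub in sorted(pubs)}

# Toy example (hypothetical records, not data from the study):
gs_records = [
    {"cited_pub_id": "P1", "source_verified": True},
    {"cited_pub_id": "P1", "source_verified": False},  # dropped as non-verifiable
    {"cited_pub_id": "P2", "source_verified": True},
]
wos_counts = {"P1": 1, "P2": 0}
print(compare_gs_wos(gs_records, wos_counts))  # {'P1': (1, 1), 'P2': (1, 0)}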

Suggested Citation

  • Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
  • Handle: RePEc:oup:rseval:v:25:y:2016:i:3:p:264-270.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvv049
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Diogo Da Fonseca-Soares & Josicleda Domiciano Galvinicio & Sayonara Andrade Eliziário & Angel Fermin Ramos-Ridao, 2022. "A Bibliometric Analysis of the Trends and Characteristics of Railway Research," Sustainability, MDPI, vol. 14(21), pages 1-19, October.
    2. Guy Madison & Knut Sundell, 2022. "Numbers of publications and citations for researchers in fields pertinent to the social services: a comparison of peer-reviewed journal publications across six disciplines," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 6029-6046, October.
    3. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    4. Michael Gusenbauer, 2019. "Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 177-214, January.
    5. Nadia Simoes & Nuno Crespo, 2020. "A flexible approach for measuring author-level publishing performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 331-355, January.
    6. Marcelo Mendoza, 2021. "Differences in Citation Patterns across Areas, Article Types and Age Groups of Researchers," Publications, MDPI, vol. 9(4), pages 1-23, October.
    7. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    8. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    9. Siviwe Bangani, 2018. "The impact of electronic theses and dissertations: a study of the institutional repository of a university in South Africa," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 131-151, April.
    10. John Mingers & Martin Meyer, 2017. "Normalizing Google Scholar data for use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 1111-1121, August.
    11. Korytkowski, Przemyslaw & Kulczycki, Emanuel, 2019. "Publication counting methods for a national research evaluation exercise," Journal of Informetrics, Elsevier, vol. 13(3), pages 804-816.
    12. Azagra-Caro, Joaquín M. & Barberá-Tomás, David & Edwards-Schachter, Mónica & Tur, Elena M., 2017. "Dynamic interactions between university-industry knowledge transfer channels: A case study of the most highly cited academic patent," Research Policy, Elsevier, vol. 46(2), pages 463-474.
    13. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    14. Adolfo Cazorla-Montero & Ignacio De los Ríos-Carmenado, 2023. "From “Putting the Last First” to “Working with People” in Rural Development Planning: A Bibliometric Analysis of 50 Years of Research," Sustainability, MDPI, vol. 15(13), pages 1-27, June.
    15. Sven E. Hug & Michael Ochsner & Martin P. Brändle, 2017. "Citation analysis with microsoft academic," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 371-378, April.
    16. Martín-Martín, Alberto & Orduna-Malea, Enrique & Thelwall, Mike & Delgado López-Cózar, Emilio, 2018. "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories," Journal of Informetrics, Elsevier, vol. 12(4), pages 1160-1177.
    17. Shirley Ainsworth & Jane M. Russell, 2018. "Has hosting on science direct improved the visibility of Latin American scholarly journals? A preliminary analysis of data quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1463-1484, June.
    18. Maor Weinberger & Maayan Zhitomirsky-Geffet, 2021. "Diversity of success: measuring the scholarly performance diversity of tenured professors in the Israeli academia," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 2931-2970, April.
    19. Irina Gerasimov & Binita KC & Armin Mehrabian & James Acker & Michael P. McGuire, 2024. "Comparison of datasets citation coverage in Google Scholar, Web of Science, Scopus, Crossref, and DataCite," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 3681-3704, July.
    20. Alberto Martín-Martín & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2018. "A novel method for depicting academic disciplines through Google Scholar Citations: The case of Bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1251-1273, March.
    21. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    22. John Mingers & Jesse R. O’Hanley & Musbaudeen Okunola, 2017. "Using Google Scholar institutional level data to evaluate the quality of university research," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1627-1643, December.
    23. Houqiang Yu & Xueting Cao & Tingting Xiao & Zhenyi Yang, 2020. "How accurate are policy document mentions? A first look at the role of altmetrics database," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1517-1540, November.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.