Printed from https://ideas.repec.org/a/spr/scient/v110y2017i1d10.1007_s11192-016-2185-x.html

Microsoft Academic: is the phoenix getting wings?

Author

Listed:
  • Anne-Wil Harzing

    (Middlesex University)

  • Satu Alakangas

    (University of Melbourne)

Abstract

In this article, we compare publication and citation coverage of the new Microsoft Academic with all other major sources for bibliometric data: Google Scholar, Scopus, and the Web of Science, using a sample of 145 academics in five broad disciplinary areas: Life Sciences, Sciences, Engineering, Social Sciences, and Humanities. When using the more conservative linked citation counts for Microsoft Academic, this data source provides higher citation counts than both Scopus and the Web of Science for Engineering, the Social Sciences, and the Humanities, whereas citation counts for the Life Sciences and the Sciences are fairly similar across these three databases. Google Scholar still reports the highest citation counts for all disciplines. When using the more liberal estimated citation counts for Microsoft Academic, its average citation counts are higher than both Scopus and the Web of Science for all disciplines. For the Life Sciences, Microsoft Academic estimated citation counts are even higher than Google Scholar counts, whereas for the Sciences they are almost identical. For Engineering, Microsoft Academic estimated citation counts are 14% lower than Google Scholar citation counts, whereas for the Social Sciences they are 23% lower. Only for the Humanities are they substantially (69%) lower than Google Scholar citation counts. Overall, this first large-scale comparative study suggests that the new incarnation of Microsoft Academic presents us with an excellent alternative for citation analysis. We therefore conclude that the Microsoft Academic Phoenix is undeniably growing wings; it might be ready to fly off and start its adult life in the field of research evaluation soon.
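The percentage gaps quoted in the abstract (14%, 23%, 69% lower than Google Scholar) are relative differences between per-database citation counts. A minimal sketch of that calculation, using hypothetical placeholder counts rather than the study's actual data:

```python
# Sketch of the relative-difference comparison described in the abstract.
# The counts passed in below are hypothetical placeholders, not the study's data.

def pct_lower(ma_count: float, gs_count: float) -> float:
    """Percentage by which one database's citation count (e.g. Microsoft
    Academic) falls below another's (e.g. Google Scholar)."""
    return round(100 * (gs_count - ma_count) / gs_count, 1)

# If Google Scholar reported 100 citations and Microsoft Academic 86,
# the Microsoft Academic count would be 14.0% lower:
print(pct_lower(86, 100))   # 14.0
```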

Suggested Citation

  • Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic: is the phoenix getting wings?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 371-383, January.
  • Handle: RePEc:spr:scient:v:110:y:2017:i:1:d:10.1007_s11192-016-2185-x
    DOI: 10.1007/s11192-016-2185-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2185-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-2185-x?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    2. Anne-Wil Harzing, 2016. "Microsoft Academic (Search): a Phoenix arisen from the ashes?," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1637-1647, September.
    3. Anne-Wil Harzing & Satu Alakangas & David Adams, 2014. "hIa: an individual annual h-index to accommodate disciplinary and career length differences," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 811-821, June.
    4. Lorna Wildgaard, 2015. "A comparison of 17 author-level bibliometric indicators for researchers in Astronomy, Environmental Science, Philosophy and Public Health in Web of Science and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 873-906, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic is one year old: the Phoenix is ready to leave the nest," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1887-1894, September.
    2. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    3. Yihui Lan & Kenneth W Clements & Zong Ken Chai, 2022. "Australian PhDs in Economics and Finance: Professional Activities, Productivity and Prospects," Economics Discussion / Working Papers 22-04, The University of Western Australia, Department of Economics.
    4. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    5. Cristòfol Rovira & Lluís Codina & Frederic Guerrero-Solé & Carlos Lopezosa, 2019. "Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus," Future Internet, MDPI, vol. 11(9), pages 1-21, September.
    6. Sumiko Asai, 2020. "The effect of collaboration with large publishers on the internationality and influence of open access journals for research institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 663-677, July.
    7. Sumiko Asai, 2021. "Collaboration between research institutes and large and small publishers for publishing open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5245-5262, June.
    8. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    9. Abdelghani Maddi & Aouatif de La Laurencie, 2018. "La dynamique des SHS françaises dans le Web of Science : un manque de représentativité ou de visibilité internationale ?," Working Papers hal-01922266, HAL.
    10. Robin Haunschild & Sven E. Hug & Martin P. Brändle & Lutz Bornmann, 2018. "The number of linked references of publications in Microsoft Academic in comparison with the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 367-370, January.
    11. V. A. Traag & L. Waltman, 2019. "Systematic analysis of agreement between metrics and peer review in the UK REF," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    12. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    13. Goshu Desalegn & Anita Tangl, 2022. "Banning Vs Taxing, Reviewing the Potential Opportunities and Challenges of Plastic Products," Sustainability, MDPI, vol. 14(12), pages 1-16, June.
    14. Kousha, Kayvan & Thelwall, Mike & Abdoli, Mahshid, 2018. "Can Microsoft Academic assess the early citation impact of in-press articles? A multi-discipline exploratory analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 287-298.
    15. Arash Najmaei & Zahra Sadeghinejad, 2023. "Green and sustainable business models: historical roots, growth trajectory, conceptual architecture and an agenda for future research—A bibliometric review of green and sustainable business models," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(2), pages 957-999, February.
    16. Anand, Amitabh & Argade, Padmaja & Barkemeyer, Ralf & Salignac, Fanny, 2021. "Trends and patterns in sustainable entrepreneurship research: A bibliometric review and research agenda," Journal of Business Venturing, Elsevier, vol. 36(3).
    17. Abdelghani Maddi & Aouatif de La Laurencie, 2018. "La dynamique des SHS françaises dans le Web of Science : un manque de représentativité ou de visibilité internationale ?," CEPN Working Papers hal-01922266, HAL.
    18. Abdelghani Maddi & Aouatif De La Laurencie, 2018. "La dynamique des SHS françaises dans le Web of Science," CEPN Working Papers 2018-05, Centre d'Economie de l'Université de Paris Nord.
    19. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    20. Kousha, Kayvan & Thelwall, Mike, 2018. "Can Microsoft Academic help to assess the citation impact of academic books?," Journal of Informetrics, Elsevier, vol. 12(3), pages 972-984.
    21. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    22. Mike Thelwall & Nabeil Maflahi, 2020. "Academic collaboration rates and citation associations vary substantially between countries and fields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(8), pages 968-978, August.
    23. Mike Thelwall, 2018. "Does Microsoft Academic find early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 325-334, January.
    24. Michael Thelwall, 2018. "Can Microsoft Academic be used for citation analysis of preprint archives? The case of the Social Science Research Network," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 913-928, May.
    25. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic is one year old: the Phoenix is ready to leave the nest," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1887-1894, September.
    2. Fabio Zagonari, 2019. "Scientific Production and Productivity for Characterizing an Author’s Publication History: Simple and Nested Gini’s and Hirsch’s Indexes Combined," Publications, MDPI, vol. 7(2), pages 1-30, May.
    3. Anne-Wil Harzing, 2016. "Microsoft Academic (Search): a Phoenix arisen from the ashes?," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1637-1647, September.
    4. John Mingers & Martin Meyer, 2017. "Normalizing Google Scholar data for use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 1111-1121, August.
    5. Martín-Martín, Alberto & Orduna-Malea, Enrique & Thelwall, Mike & Delgado López-Cózar, Emilio, 2018. "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories," Journal of Informetrics, Elsevier, vol. 12(4), pages 1160-1177.
    6. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.
    7. James C. Ryan, 2016. "A validation of the individual annual h-index (hIa): application of the hIa to a qualitatively and quantitatively different sample," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(1), pages 577-590, October.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    9. Belussi, Fiorenza & Orsi, Luigi & Savarese, Maria, 2019. "Mapping Business Model Research: A Document Bibliometric Analysis," Scandinavian Journal of Management, Elsevier, vol. 35(3).
    10. Sven E. Hug & Michael Ochsner & Martin P. Brändle, 2017. "Citation analysis with Microsoft Academic," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 371-378, April.
    11. Andersen, Jens Peter & Nielsen, Mathias Wullum, 2018. "Google Scholar and Web of Science: Examining gender differences in citation coverage across five scientific disciplines," Journal of Informetrics, Elsevier, vol. 12(3), pages 950-959.
    12. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    13. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    14. Moed, Henk F. & Bar-Ilan, Judit & Halevi, Gali, 2016. "A new methodology for comparing Google Scholar and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 533-551.
    15. Claudiu Herteliu & Marcel Ausloos & Bogdan Vasile Ileanu & Giulia Rotundo & Tudorel Andrei, 2017. "Quantitative and Qualitative Analysis of Editor Behavior through Potentially Coercive Citations," Publications, MDPI, vol. 5(2), pages 1-16, June.
    16. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    17. Maor Weinberger & Maayan Zhitomirsky-Geffet, 2021. "Diversity of success: measuring the scholarly performance diversity of tenured professors in the Israeli academia," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 2931-2970, April.
    18. Mike Thelwall, 2018. "Does Microsoft Academic find early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 325-334, January.
    19. Drahomira Herrmannova & Robert M. Patton & Petr Knoth & Christopher G. Stahl, 2018. "Do citations and readership identify seminal publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 239-262, April.
    20. Halevi, Gali & Moed, Henk & Bar-Ilan, Judit, 2017. "Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation—Review of the Literature," Journal of Informetrics, Elsevier, vol. 11(3), pages 823-834.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:110:y:2017:i:1:d:10.1007_s11192-016-2185-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.