Printed from https://ideas.repec.org/a/spr/scient/v111y2017i1d10.1007_s11192-017-2247-8.html

Citation analysis with Microsoft Academic

Authors

Listed:
  • Sven E. Hug

    (ETH Zurich
    University of Zurich)

  • Michael Ochsner

    (ETH Zurich
    University of Lausanne)

  • Martin P. Brändle

    (University of Zurich)

Abstract

We explore whether and how Microsoft Academic (MA) could be used for bibliometric analyses. First, we examine the Academic Knowledge API (AK API), an interface to access MA data, and compare it to Google Scholar (GS). Second, we perform a comparative citation analysis of researchers by normalizing data from MA and Scopus. We find that MA offers structured and rich metadata, which facilitates data retrieval, handling and processing. In addition, the AK API allows retrieving frequency distributions of citations. We consider these features to be a major advantage of MA over GS. However, we identify four main limitations regarding the available metadata. First, MA does not provide the document type of a publication. Second, the “fields of study” are dynamic, too specific, and their field hierarchies are incoherent. Third, some publications are assigned to incorrect years. Fourth, the metadata of some publications does not include all authors. Nevertheless, we show that an average-based indicator (the journal normalized citation score; JNCS) as well as a distribution-based indicator (percentile rank classes; PR classes) can be calculated with relative ease using MA. Hence, normalization of citation counts is feasible with MA. The citation analyses in MA and Scopus yield consistent results: the JNCS and the PR classes are similar in both databases, and, as a consequence, the evaluation of the researchers’ publication impact is congruent in MA and Scopus. Given MA’s fast development over the last year, we postulate that it has the potential to be used for full-fledged bibliometric analyses.
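The two indicators named in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's exact procedure: the citation counts and journal means are invented, and the tie-handling convention for percentile ranks is only one of several discussed in the literature (e.g. Bornmann, Leydesdorff & Mutz 2013, listed in the references).

```python
# Hypothetical citation data: each paper's citation count and the mean
# citation count of the journal it appeared in (the reference set).
papers = [
    {"citations": 12, "journal_mean": 6.0},
    {"citations": 3,  "journal_mean": 6.0},
    {"citations": 20, "journal_mean": 10.0},
]

# Journal normalized citation score (JNCS): the mean of the per-paper
# ratios of citations received to the journal's mean citations.
jncs = sum(p["citations"] / p["journal_mean"] for p in papers) / len(papers)
print(jncs)  # 1.5

def percentile_rank(c, reference):
    """Percent of the reference distribution cited less often than c,
    counting ties as half (one convention among several in use)."""
    fewer = sum(1 for r in reference if r < c)
    ties = sum(1 for r in reference if r == c)
    return 100.0 * (fewer + 0.5 * ties) / len(reference)

def pr_class(p):
    """Map a percentile rank to six commonly used rank classes."""
    for threshold, label in [(99, "top 1%"), (95, "top 5%"), (90, "top 10%"),
                             (75, "top 25%"), (50, "top 50%")]:
        if p >= threshold:
            return label
    return "bottom 50%"

# Hypothetical citation distribution of one journal, standing in for the
# frequency distributions the abstract says the AK API can retrieve.
reference = [0, 1, 1, 2, 3, 5, 8, 13, 21, 40]
print(pr_class(percentile_rank(21, reference)))  # top 25%
```

Because PR classes depend only on a paper's position within the reference distribution, they are robust to the long right tail of citation counts, which is why the abstract pairs this distribution-based indicator with the average-based JNCS.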

Suggested Citation

  • Sven E. Hug & Michael Ochsner & Martin P. Brändle, 2017. "Citation analysis with microsoft academic," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 371-378, April.
  • Handle: RePEc:spr:scient:v:111:y:2017:i:1:d:10.1007_s11192-017-2247-8
    DOI: 10.1007/s11192-017-2247-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-017-2247-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-017-2247-8?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    2. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    3. Anne-Wil Harzing, 2016. "Microsoft Academic (Search): a Phoenix arisen from the ashes?," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1637-1647, September.
    4. Anne-Wil Harzing & Satu Alakangas & David Adams, 2014. "hIa: an individual annual h-index to accommodate disciplinary and career length differences," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 811-821, June.
    5. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    6. Lutz Bornmann & Andreas Thor & Werner Marx & Hermann Schier, 2016. "The application of bibliometrics to research evaluation in the humanities and social sciences: An exploratory study using normalized Google Scholar data for the publications of a research institute," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2778-2789, November.
    7. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Lutz Bornmann & Robin Haunschild & Sven E. Hug, 2018. "Visualizing the context of citations referencing papers published by Eugene Garfield: a new type of keyword co-occurrence analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 427-437, February.
    2. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    3. Martin Wieland & Juan Gorraiz, 2020. "The rivalry between Bernini and Borromini from a scientometric perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1643-1663, November.
    4. Klaus Kammerer & Manuel Göster & Manfred Reichert & Rüdiger Pryss, 2021. "Ambalytics: A Scalable and Distributed System Architecture Concept for Bibliometric Network Analyses," Future Internet, MDPI, vol. 13(8), pages 1-29, August.
    5. Nisar Ali & Zahid Halim & Syed Fawad Hussain, 2023. "An artificial intelligence-based framework for data-driven categorization of computer scientists: a case study of world’s Top 10 computing departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1513-1545, March.
    6. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    7. Jung, Sukhwan & Yoon, Wan Chul, 2020. "An alternative topic model based on Common Interest Authors for topic evolution analysis," Journal of Informetrics, Elsevier, vol. 14(3).
    8. Jung, Sukhwan & Segev, Aviv, 2022. "DAC: Descendant-aware clustering algorithm for network-based topic emergence prediction," Journal of Informetrics, Elsevier, vol. 16(3).
    9. Kousha, Kayvan & Thelwall, Mike, 2018. "Can Microsoft Academic help to assess the citation impact of academic books?," Journal of Informetrics, Elsevier, vol. 12(3), pages 972-984.
    10. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    11. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    12. Cristòfol Rovira & Lluís Codina & Frederic Guerrero-Solé & Carlos Lopezosa, 2019. "Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus," Future Internet, MDPI, vol. 11(9), pages 1-21, September.
    13. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    14. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    15. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    16. Lutz Bornmann & K. Brad Wray & Robin Haunschild, 2020. "Citation concept analysis (CCA): a new form of citation analysis revealing the usefulness of concepts for other researchers illustrated by exemplary case studies including classic books by Thomas S. K," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1051-1074, February.
    17. Tahamtan, Iman & Bornmann, Lutz, 2018. "Core elements in the process of citing publications: Conceptual overview of the literature," Journal of Informetrics, Elsevier, vol. 12(1), pages 203-216.
    18. Robin Haunschild & Sven E. Hug & Martin P. Brändle & Lutz Bornmann, 2018. "The number of linked references of publications in Microsoft Academic in comparison with the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 367-370, January.
    19. van der Wouden, Frank & Youn, Hyejin, 2023. "The impact of geographical distance on learning through collaboration," Research Policy, Elsevier, vol. 52(2).
    20. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    21. Vaccario, Giacomo & Medo, Matúš & Wider, Nicolas & Mariani, Manuel Sebastian, 2017. "Quantifying and suppressing ranking bias in a large citation network," Journal of Informetrics, Elsevier, vol. 11(3), pages 766-782.
    22. Kousha, Kayvan & Thelwall, Mike & Abdoli, Mahshid, 2018. "Can Microsoft Academic assess the early citation impact of in-press articles? A multi-discipline exploratory analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 287-298.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    2. John Mingers & Martin Meyer, 2017. "Normalizing Google Scholar data for use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 1111-1121, August.
    3. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Technological research in the EU is less efficient than the world average. EU research policy risks Europeans’ future," Journal of Informetrics, Elsevier, vol. 12(3), pages 718-731.
    4. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    5. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.
    6. Thelwall, Mike, 2016. "The precision of the arithmetic mean, geometric mean and percentiles for citation data: An experimental simulation modelling approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 110-123.
    7. Schreiber, Michael, 2014. "How to improve the outcome of performance evaluations in terms of percentiles for citation frequencies of my papers," Journal of Informetrics, Elsevier, vol. 8(4), pages 873-879.
    8. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic is one year old: the Phoenix is ready to leave the nest," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1887-1894, September.
    9. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    10. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    11. John Mingers & Jesse R. O’Hanley & Musbaudeen Okunola, 2017. "Using Google Scholar institutional level data to evaluate the quality of university research," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1627-1643, December.
    12. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    13. Hamdi A. Al-Jamimi & Galal M. BinMakhashen & Lutz Bornmann & Yousif Ahmed Al Wajih, 2023. "Saudi Arabia research: academic insights and trend analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5595-5627, October.
    14. Alberto Martín-Martín & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2018. "A novel method for depicting academic disciplines through Google Scholar Citations: The case of Bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1251-1273, March.
    15. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    16. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    17. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic: is the phoenix getting wings?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 371-383, January.
    18. Chen, Shiji & Qiu, Junping & Arsenault, Clément & Larivière, Vincent, 2021. "Exploring the interdisciplinarity patterns of highly cited papers," Journal of Informetrics, Elsevier, vol. 15(1).
    19. Gerson Pech & Catarina Delgado, 2020. "Assessing the publication impact using citation data from both Scopus and WoS databases: an approach validated in 15 research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 909-924, November.
    20. Lucília Cardoso & Rui Silva & Giovana Goretti Feijó de Almeida & Luís Lima Santos, 2020. "A Bibliometric Model to Analyze Country Research Performance: SciVal Topic Prominence Approach in Tourism, Leisure and Hospitality," Sustainability, MDPI, vol. 12(23), pages 1-26, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:111:y:2017:i:1:d:10.1007_s11192-017-2247-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.