Printed from https://ideas.repec.org/a/eee/infome/v12y2018i2p430-435.html

Dimensions: A competitor to Scopus and the Web of Science?

Author

Listed:
  • Thelwall, Mike

Abstract

Dimensions is a partly free scholarly database launched by Digital Science in January 2018. Dimensions includes journal articles and citation counts, making it a potential new source of impact data. This article explores the value of Dimensions from an impact assessment perspective with an examination of Food Science research 2008–2018 and a random sample of 10,000 Scopus articles from 2012. The results include high correlations between citation counts from Scopus and Dimensions (0.96 by narrow field in 2012) as well as similar average counts. Almost all Scopus articles with DOIs were found in Dimensions (97% in 2012). Thus, the scholarly database component of Dimensions seems to be a plausible alternative to Scopus and the Web of Science for general citation analyses and for citation data in support of some types of research evaluations.
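The headline figure in the abstract is a rank correlation between the citation counts that two databases report for the same set of articles. As an illustration only (the data below are invented, not from the article), the comparison can be sketched with a small Spearman correlation, the usual choice for skewed citation data:

```python
# Illustrative sketch: comparing citation counts for the same articles in two
# databases (e.g. Scopus vs. Dimensions) via Spearman rank correlation.
# All data and names here are hypothetical, not taken from the article.

def rank(values):
    """Assign 1-based average ranks to values (ties share the mean rank)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical citation counts for the same seven articles in two databases.
scopus = [0, 3, 12, 7, 40, 3, 1]
dimensions = [0, 4, 11, 8, 37, 2, 1]
print(round(spearman(scopus, dimensions), 3))  # → 0.991
```

A correlation near 1, together with similar average counts and near-complete coverage of DOI-bearing articles, is what underpins the article's conclusion that Dimensions is a plausible alternative source of citation data.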

Suggested Citation

  • Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
  • Handle: RePEc:eee:infome:v:12:y:2018:i:2:p:430-435
    DOI: 10.1016/j.joi.2018.03.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771830066X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2018.03.006?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Martin-Martin, Alberto & Orduna-Malea, Enrique & Harzing, Anne-Wil & Delgado López-Cózar, Emilio, 2017. "Can we use Google Scholar to identify highly-cited documents?," Journal of Informetrics, Elsevier, vol. 11(1), pages 152-163.
    2. Emilio Delgado López-Cózar & Nicolás Robinson-García & Daniel Torres-Salinas, 2014. "The Google scholar experiment: How to index false papers and manipulate bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 446-454, March.
    3. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    4. Philippe Mongeon & Adèle Paul-Hus, 2016. "The journal coverage of Web of Science and Scopus: a comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(1), pages 213-228, January.
    5. A Cecile J W Janssens & Michael Goodman & Kimberly R Powell & Marta Gwinn, 2017. "A critical evaluation of the algorithm behind the Relative Citation Ratio (RCR)," PLOS Biology, Public Library of Science, vol. 15(10), pages 1-5, October.
    6. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.
    7. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    8. Éric Archambault & David Campbell & Yves Gingras & Vincent Larivière, 2009. "Comparing bibliometric statistics obtained from the Web of Science and Scopus," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(7), pages 1320-1326, July.
    9. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic is one year old: the Phoenix is ready to leave the nest," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1887-1894, September.
    10. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    11. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    12. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    13. Sven E. Hug & Michael Ochsner & Martin P. Brändle, 2017. "Citation analysis with Microsoft Academic," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 371-378, April.
    14. Rodrigo Costas & Zohreh Zahedi & Paul Wouters, 2015. "Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 2003-2019, October.
    15. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    16. Halevi, Gali & Moed, Henk & Bar-Ilan, Judit, 2017. "Suitability of Google Scholar as a source of scientific information and as a source of data for scientific evaluation—Review of the Literature," Journal of Informetrics, Elsevier, vol. 11(3), pages 823-834.
    17. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    2. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    3. Kousha, Kayvan & Thelwall, Mike & Abdoli, Mahshid, 2018. "Can Microsoft Academic assess the early citation impact of in-press articles? A multi-discipline exploratory analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 287-298.
    4. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    5. Michael Thelwall, 2018. "Can Microsoft Academic be used for citation analysis of preprint archives? The case of the Social Science Research Network," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 913-928, May.
    6. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    7. Michael Gusenbauer, 2019. "Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 177-214, January.
    8. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    9. Martín-Martín, Alberto & Orduna-Malea, Enrique & Thelwall, Mike & Delgado López-Cózar, Emilio, 2018. "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories," Journal of Informetrics, Elsevier, vol. 12(4), pages 1160-1177.
    10. Mike Thelwall, 2018. "Early Mendeley readers correlate with later citation counts," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1231-1240, June.
    11. Thelwall, Mike, 2018. "Do females create higher impact research? Scopus citations and Mendeley readers for articles from five countries," Journal of Informetrics, Elsevier, vol. 12(4), pages 1031-1041.
    12. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    13. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    14. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    15. Alberto Martín-Martín & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2018. "Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2175-2188, September.
    16. Kousha, Kayvan & Thelwall, Mike, 2018. "Can Microsoft Academic help to assess the citation impact of academic books?," Journal of Informetrics, Elsevier, vol. 12(3), pages 972-984.
    17. de Carvalho, Gustavo Dambiski Gomes & Sokulski, Carla Cristiane & da Silva, Wesley Vieira & de Carvalho, Hélio Gomes & de Moura, Rafael Vignoli & de Francisco, Antonio Carlos & da Veiga, Claudimar Per, 2020. "Bibliometrics and systematic reviews: A comparison between the Proknow-C and the Methodi Ordinatio," Journal of Informetrics, Elsevier, vol. 14(3).
    18. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.
    19. repec:hal:cepnwp:hal-01922266 is not listed on IDEAS
    20. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    21. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:12:y:2018:i:2:p:430-435. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.