IDEAS home. Printed from https://ideas.repec.org/a/spr/scient/v113y2017i1d10.1007_s11192-017-2474-z.html

Methodological issues in measuring citations in Wikipedia: a case study in Library and Information Science

Authors
  • Aida Pooladian (Universitat de Barcelona)

  • Ángel Borrego (Universitat de Barcelona)

Abstract

Wikipedia citations have been suggested as a metric that partially captures the impact of research, providing an indication of the transfer of scholarly output to a wider audience beyond the academic community. In this article, we explore the coverage of Library and Information Science literature published between 2001 and 2010 in Wikipedia, paying special attention to the methodological issues involved in counting Wikipedia citations. The results reveal severe limitations in the use of Wikipedia citations for research evaluation. Lack of standardization and incompleteness of Wikipedia references make it difficult to retrieve them. The number of Wikipedia citations is very low, with less than 3% of articles in the sample having been cited. A significant number of references are cited in biographical entries about the authors of the articles, resulting in a phenomenon of accumulated advantage, which is similar to the Matthew effect. Nearly one-third of the Wikipedia citations link to an open access source, although this result is probably an underestimate of open access availability, given the incompleteness of Wikipedia citations.
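The retrieval problem the abstract describes (non-standardized, often incomplete references) can be illustrated with a minimal sketch. The helper below is hypothetical, not the authors' actual method: it extracts DOIs from raw wikitext, catching both `{{cite journal}}` templates with a `|doi=` field and bare doi.org links, while unstructured references that omit the DOI are simply missed, which is one reason DOI-based counts undercount Wikipedia citations.

```python
import re

# Hypothetical helper (illustration only, not the method used in the article):
# pull DOIs out of a Wikipedia article's wikitext. Wikipedia references are not
# standardized: some use {{cite journal}} with a |doi= field, others embed a
# bare doi.org link, and many give no DOI at all and cannot be matched this way.

DOI_FIELD = re.compile(r"\|\s*doi\s*=\s*(10\.\S+?)\s*(?=\||\}\})")
DOI_URL = re.compile(r"doi\.org/(10\.[^\s|\]}]+)")

def extract_dois(wikitext: str) -> set[str]:
    """Collect DOIs from cite-template fields and bare doi.org links."""
    return set(DOI_FIELD.findall(wikitext)) | set(DOI_URL.findall(wikitext))

sample = (
    "{{cite journal |title=Example |doi=10.1007/s11192-017-2474-z }}\n"
    "See also [https://doi.org/10.1002/asi.21304 this article] and an "
    "unstructured reference with no DOI at all."
)
print(sorted(extract_dois(sample)))
# → ['10.1002/asi.21304', '10.1007/s11192-017-2474-z']
```

Note that the third, unstructured reference in the sample is invisible to the matcher, mirroring the undercounting issue the article reports.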

Suggested Citation

  • Aida Pooladian & Ángel Borrego, 2017. "Methodological issues in measuring citations in Wikipedia: a case study in Library and Information Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 455-464, October.
  • Handle: RePEc:spr:scient:v:113:y:2017:i:1:d:10.1007_s11192-017-2474-z
    DOI: 10.1007/s11192-017-2474-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-017-2474-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-017-2474-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2015. "Errors in DOI indexing by bibliometric databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2181-2186, March.
    2. Misha Teplitskiy & Grace Lu & Eamon Duede, 2017. "Amplifying the impact of open access: Wikipedia and the diffusion of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2116-2127, September.
    3. Brendan Luyt & Daniel Tan, 2010. "Improving Wikipedia's credibility: References and citations in a sample of history articles," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(4), pages 715-722, April.
    4. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "The museum of errors/horrors in Scopus," Journal of Informetrics, Elsevier, vol. 10(1), pages 174-182.
    5. Mostafa Mesgari & Chitu Okoli & Mohamad Mehdi & Finn Årup Nielsen & Arto Lanamäki, 2015. "“The sum of all human knowledge”: A systematic review of scholarly research on the content of Wikipedia," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(2), pages 219-245, February.
    6. Kayvan Kousha & Mike Thelwall, 2017. "Are wikipedia citations important evidence of the impact of scholarly articles and books?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(3), pages 762-779, March.
    7. Stefanie Haustein & Isabella Peters & Judit Bar-Ilan & Jason Priem & Hadas Shema & Jens Terliesner, 2014. "Coverage and adoption of altmetrics sources in the bibliometric community," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1145-1163, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Marion Schmidt & Wolfgang Kircheis & Arno Simons & Martin Potthast & Benno Stein, 2023. "A diachronic perspective on citation latency in Wikipedia articles on CRISPR/Cas-9: an exploratory case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3649-3673, June.
    2. Mingyang Wang & Jiaqi Zhang & Guangsheng Chen & Kah-Hin Chai, 2019. "Examining the influence of open access on journals’ citation obsolescence by modeling the actual citation process," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1621-1641, June.
    3. Houcemeddine Turki & Mohamed Ali Hadj Taieb & Mohamed Ben Aouicha, 2020. "Facts to consider when analyzing the references of Nobel Prize scientific background," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 787-790, July.
    4. Xi Zhang & Xianhai Wang & Hongke Zhao & Patricia Ordóñez de Pablos & Yongqiang Sun & Hui Xiong, 2019. "An effectiveness analysis of altmetrics indices for different levels of artificial intelligence publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1311-1344, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonio Eleazar Serrano-López & Peter Ingwersen & Elias Sanz-Casado, 2017. "Wind power research in Wikipedia: Does Wikipedia demonstrate direct influence of research publications and can it be used as adequate source in research evaluation?," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1471-1488, September.
    2. Marion Schmidt & Wolfgang Kircheis & Arno Simons & Martin Potthast & Benno Stein, 2023. "A diachronic perspective on citation latency in Wikipedia articles on CRISPR/Cas-9: an exploratory case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3649-3673, June.
    3. Nicolas Jullien, 2012. "What We Know About Wikipedia: A Review of the Literature Analyzing the Project(s)," Post-Print hal-00857208, HAL.
    4. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    5. Sergio Copiello, 2019. "The open access citation premium may depend on the openness and inclusiveness of the indexing database, but the relationship is controversial because it is ambiguous where the open access boundary lie," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 995-1018, November.
    6. Bornmann, Lutz & Haunschild, Robin & Adams, Jonathan, 2019. "Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)," Journal of Informetrics, Elsevier, vol. 13(1), pages 325-340.
    7. Jaehun Joo & Ismatilla Normatov, 2013. "Determinants of collective intelligence quality: comparison between Wiki and Q&A services in English and Korean users," Service Business, Springer;Pan-Pacific Business Association, vol. 7(4), pages 687-711, December.
    8. Shirley Ainsworth & Jane M. Russell, 2018. "Has hosting on science direct improved the visibility of Latin American scholarly journals? A preliminary analysis of data quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1463-1484, June.
    9. Junwen Zhu & Guangyuan Hu & Weishu Liu, 2019. "DOI errors and possible solutions for Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 709-718, February.
    10. Amalia Mas-Bleda & Mike Thelwall, 2016. "Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2007-2030, December.
    11. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    12. Igor Savchenko & Denis Kosyakov, 2022. "Lost in affiliation: apatride publications in international databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3471-3487, June.
    13. Xiang Zheng & Jiajing Chen & Erjia Yan & Chaoqun Ni, 2023. "Gender and country biases in Wikipedia citations to scholarly publications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 219-233, February.
    14. Xiaoling Huang & Lei Wang & Weishu Liu, 2023. "Identification of national research output using Scopus/Web of Science Core Collection: a revisit and further investigation," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2337-2347, April.
    15. Alessia Cioffi & Sara Coppini & Arcangelo Massari & Arianna Moretti & Silvio Peroni & Cristian Santini & Nooshin Shahidzadeh Asadi, 2022. "Identifying and correcting invalid citations due to DOI errors in Crossref data," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3593-3612, June.
    16. Junwen Zhu & Fang Liu & Weishu Liu, 2019. "The secrets behind Web of Science’s DOI search," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1745-1753, June.
    17. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    18. Bo-Christer Björk & Sari Kanto-Karvonen & J. Tuomas Harviainen, 2020. "How Frequently Are Articles in Predatory Open Access Journals Cited," Publications, MDPI, vol. 8(2), pages 1-12, March.
    19. Ni Cheng & Ke Dong, 2018. "Knowledge communication on social media: a case study of Biomedical Science on Baidu Baike," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 1749-1770, September.
    20. Mikhail Rogov & Céline Rozenblat, 2018. "Urban Resilience Discourse Analysis: Towards a Multi-Level Approach to Cities," Sustainability, MDPI, vol. 10(12), pages 1-21, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:113:y:2017:i:1:d:10.1007_s11192-017-2474-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.