Printed from https://ideas.repec.org/a/spr/scient/v126y2021i6d10.1007_s11192-021-03962-7.html

Can tweets be used to detect problems early with scientific papers? A case study of three retracted COVID-19/SARS-CoV-2 papers

Author

Listed:
  • Robin Haunschild

    (Max Planck Institute for Solid State Research)

  • Lutz Bornmann

    (Administrative Headquarters)

Abstract

Methodological mistakes, data errors, and scientific misconduct are considered prevalent problems in science that are often difficult to detect. In this study, we explore the potential of using data from Twitter to discover problems with publications. In this case study, we analyzed tweet texts of three retracted publications about COVID-19 (Coronavirus disease 2019)/SARS-CoV-2 (severe acute respiratory syndrome coronavirus 2) and their retraction notices. We did not find early warning signs in tweet texts regarding one publication, but we did find tweets that cast doubt on the validity of the two other publications shortly after their publication date. An extension of our current work might lead to an early warning system that makes the scientific community aware of problems with certain publications. Other sources, such as blogs or post-publication peer-review sites, could be included in such an early warning system. The methodology proposed in this case study should be validated using larger publication sets that also include a control group, i.e., publications that were not retracted.
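The kind of tweet-text screening the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' actual method: the list of doubt-related terms, the scoring function, and the threshold are all illustrative assumptions.

```python
# Hypothetical sketch of flagging tweets that may cast doubt on a paper.
# The term list and threshold are illustrative assumptions, not the
# procedure used in the study.

DOUBT_TERMS = ["retract", "flawed", "misconduct", "fabricated",
               "questionable", "doubt", "error", "concern"]

def doubt_score(tweet_text):
    """Count how many doubt-related terms occur in a tweet (case-insensitive)."""
    text = tweet_text.lower()
    return sum(term in text for term in DOUBT_TERMS)

def flag_early_warnings(tweets, threshold=1):
    """Return tweets whose doubt score meets or exceeds the threshold."""
    return [t for t in tweets if doubt_score(t) >= threshold]

tweets = [
    "Exciting new COVID-19 paper out today!",
    "Serious doubt about the data in this paper; the stats look flawed.",
]
flagged = flag_early_warnings(tweets)
```

In practice, such a filter would run over tweets collected shortly after a paper's publication date; the study's point is that for two of the three retracted papers, doubting tweets appeared well before the retraction notices.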

Suggested Citation

  • Robin Haunschild & Lutz Bornmann, 2021. "Can tweets be used to detect problems early with scientific papers? A case study of three retracted COVID-19/SARS-CoV-2 papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5181-5199, June.
  • Handle: RePEc:spr:scient:v:126:y:2021:i:6:d:10.1007_s11192-021-03962-7
    DOI: 10.1007/s11192-021-03962-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-021-03962-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-021-03962-7?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Feinerer, Ingo & Hornik, Kurt & Meyer, David, 2008. "Text Mining Infrastructure in R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 25(i05).
    2. Qin Zhang & Juneman Abraham & Hui-Zhen Fu, 2020. "Collaboration and its influence on retraction based on retracted publications during 1978–2017," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 213-232, October.
    3. Lutz Bornmann & Robin Haunschild, 2018. "Allegation of scientific misconduct increases Twitter attention," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1097-1100, May.
    4. Rodrigo Costas & Zohreh Zahedi & Paul Wouters, 2015. "Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 2003-2019, October.
    5. Sergio Copiello, 2020. "Other than detecting impact in advance, alternative metrics could act as early warning signs of retractions: tentative findings of a study into the papers retracted by PLoS ONE," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2449-2469, December.
    6. Julia Vainio & Kim Holmberg, 2017. "Highly tweeted science articles: who tweets them? An analysis of Twitter user profile descriptions," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 345-366, July.
    7. Mohammadamin Erfanmanesh & Jaime A. Teixeira da Silva, 2019. "Is the soundness-only quality control policy of open access mega journals linked to a higher rate of published errors?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 917-923, August.
    8. Lutz Bornmann, 2013. "Research Misconduct—Definitions, Manifestations and Extent," Publications, MDPI, vol. 1(3), pages 1-12, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Salim Moussa, 2022. "The propagation of error: retracted articles in marketing and their citations," Italian Journal of Marketing, Springer, vol. 2022(1), pages 11-36, March.
    2. Constantin Bürgi & Klaus Wohlrabe, 2022. "The influence of Covid-19 on publications in economics: bibliometric evidence from five working paper series," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5175-5189, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Behzad Gholampour & Sajad Gholampour & Alireza Noruzi & Clément Arsenault & Thomas Haertlé & Ali Akbar Saboury, 2022. "Retracted articles in oncology in the last three decades: frequency, reasons, and themes," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1841-1865, April.
    2. Yu, Houqiang & Xiao, Tingting & Xu, Shenmeng & Wang, Yuefen, 2019. "Who posts scientific tweets? An investigation into the productivity, locations, and identities of scientific tweeters," Journal of Informetrics, Elsevier, vol. 13(3), pages 841-855.
    3. Fei Shu & Wen Lou & Stefanie Haustein, 2018. "Can Twitter increase the visibility of Chinese publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 505-519, July.
    4. Anwar Said & Timothy D. Bowman & Rabeeh Ayaz Abbasi & Naif Radi Aljohani & Saeed-Ul Hassan & Raheel Nawaz, 2019. "Mining network-level properties of Twitter altmetrics data," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 217-235, July.
    5. Zhichao Fang & Rodrigo Costas & Paul Wouters, 2022. "User engagement with scholarly tweets of scientific papers: a large-scale and cross-disciplinary analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4523-4546, August.
    6. Sergio Copiello, 2020. "Other than detecting impact in advance, alternative metrics could act as early warning signs of retractions: tentative findings of a study into the papers retracted by PLoS ONE," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2449-2469, December.
    7. Núria Bautista-Puig & Daniela De Filippo & Elba Mauleón & Elías Sanz-Casado, 2019. "Scientific Landscape of Citizen Science Publications: Dynamics, Content and Presence in Social Media," Publications, MDPI, vol. 7(1), pages 1-22, February.
    8. Grinis, Inna, 2017. "The STEM requirements of "non-STEM" jobs: evidence from UK online vacancy postings and implications for skills & knowledge shortages," LSE Research Online Documents on Economics 85123, London School of Economics and Political Science, LSE Library.
    9. Sjoerd Halem & Eeske Roekel & Jaap Denissen, 2024. "Understanding the Dynamics of Hedonic and Eudaimonic Motives on Daily Well-Being: Insights from Experience Sampling Data," Journal of Happiness Studies, Springer, vol. 25(7), pages 1-25, October.
    10. Julia Bachtrögler & Christoph Hammer & Wolf Heinrich Reuter & Florian Schwendinger, 2019. "Guide to the galaxy of EU regional funds recipients: evidence from new data," Empirica, Springer;Austrian Institute for Economic Research;Austrian Economic Association, vol. 46(1), pages 103-150, February.
    11. Metwaly Ali Mohamed Eldakar, 2019. "Who reads international Egyptian academic articles? An altmetrics analysis of Mendeley readership categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 105-135, October.
    12. Ying Guo & Xiantao Xiao, 2022. "Author-level altmetrics for the evaluation of Chinese scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 973-990, February.
    13. Shuyue Huang & Lena Jingen Liang & Hwansuk Chris Choi, 2022. "How We Failed in Context: A Text-Mining Approach to Understanding Hotel Service Failures," Sustainability, MDPI, vol. 14(5), pages 1-18, February.
    14. Laura Anderlucci & Cinzia Viroli, 2020. "Mixtures of Dirichlet-Multinomial distributions for supervised and unsupervised classification of short text data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 14(4), pages 759-770, December.
    15. Stefano Sbalchiero & Maciej Eder, 2020. "Topic modeling, long texts and the best number of topics. Some Problems and solutions," Quality & Quantity: International Journal of Methodology, Springer, vol. 54(4), pages 1095-1108, August.
    16. Daniela De Filippo & Fernanda Morillo & Borja González-Albo, 2023. "Measuring the Impact and Influence of Scientific Activity in the Humanities and Social Sciences," Publications, MDPI, vol. 11(2), pages 1-17, June.
    17. Jianhua Hou & Xiucai Yang & Yang Zhang, 2023. "The effect of social media knowledge cascade: an analysis of scientific papers diffusion," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5169-5195, September.
    18. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    19. Yaxue Ma & Zhichao Ba & Yuxiang Zhao & Jin Mao & Gang Li, 2021. "Understanding and predicting the dissemination of scientific papers on social media: a two-step simultaneous equation modeling–artificial neural network approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 7051-7085, August.
    20. Daoud, Adel & Kohl, Sebastian, 2016. "How much do sociologists write about economic topics? Using big data to test some conventional views in economic sociology, 1890 to 2014," MPIfG Discussion Paper 16/7, Max Planck Institute for the Study of Societies.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:126:y:2021:i:6:d:10.1007_s11192-021-03962-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.