
Public Availability of Published Research Data in High-Impact Journals

Authors

Listed:
  • Alawi A Alsheikh-Ali
  • Waqas Qureshi
  • Mouaz H Al-Mallah
  • John P A Ioannidis

Abstract

Background: There is increasing interest in making primary data from published research publicly available. We aimed to assess the current status of making research data available in highly cited journals across the scientific literature.

Methods and Results: We reviewed the first 10 original research papers of 2009 published in the 50 original research journals with the highest impact factor. For each journal we documented the policies related to public availability and sharing of data. Of the 50 journals, 44 (88%) had a statement in their instructions to authors related to public availability and sharing of data. However, journal requirements varied widely, from requiring the sharing of all primary data related to the research to merely requiring a statement in the published manuscript that data are available on request. Of the 500 assessed papers, 149 (30%) were not subject to any data availability policy. Of the remaining 351 papers covered by some data availability policy, 208 (59%) did not fully adhere to the data availability instructions of the journals in which they were published, most commonly (73%) by not publicly depositing microarray data. The other 143 papers adhered to the data availability instructions by publicly depositing only the specific data type required, by stating a willingness to share, or by actually sharing all primary data. Overall, only 47 papers (9%) deposited full primary raw data online. None of the 149 papers that were not subject to a data availability policy made their full primary data publicly available.

Conclusion: A substantial proportion of original research papers published in high-impact journals are either not subject to any data availability policy or do not adhere to the data availability instructions of their respective journals. This empirical evaluation highlights opportunities for improvement.
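
The abstract reports nested proportions with two different denominators: all 500 assessed papers, and the 351 papers that were covered by some data availability policy. A minimal Python sketch (not from the paper; the counts are taken from the abstract above) makes those denominators explicit:

    # Sketch only: re-derive the percentages quoted in the abstract from its raw counts.
    papers_assessed = 500                      # 10 papers from each of the 50 journals
    no_policy = 149                            # papers not subject to any data availability policy
    covered = papers_assessed - no_policy      # 351 papers covered by some policy
    non_adherent = 208                         # covered papers that did not fully adhere
    adherent = covered - non_adherent          # 143 covered papers that adhered
    full_deposit = 47                          # papers depositing full primary raw data online

    print(f"Not subject to any policy: {no_policy / papers_assessed:.0%}")     # ~30%
    print(f"Covered but non-adherent:  {non_adherent / covered:.0%}")          # ~59%
    print(f"Full raw data deposited:   {full_deposit / papers_assessed:.0%}")  # ~9%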

Suggested Citation

  • Alawi A Alsheikh-Ali & Waqas Qureshi & Mouaz H Al-Mallah & John P A Ioannidis, 2011. "Public Availability of Published Research Data in High-Impact Journals," PLOS ONE, Public Library of Science, vol. 6(9), pages 1-4, September.
  • Handle: RePEc:plo:pone00:0024357
    DOI: 10.1371/journal.pone.0024357

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0024357
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0024357&type=printable
    Download Restriction: no

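    For programmatic access, the DOI above can also be resolved through the public Crossref REST API rather than the links on this page. The short Python sketch below is an illustration, not part of the IDEAS/RePEc record; it assumes the third-party requests package is installed.

    # Sketch only: fetch bibliographic metadata for this article's DOI from the Crossref REST API.
    import requests

    DOI = "10.1371/journal.pone.0024357"
    resp = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=30)
    resp.raise_for_status()
    work = resp.json()["message"]              # Crossref wraps the record in a "message" object

    print(work["title"][0])                    # article title
    print(work["container-title"][0])          # journal name
    print(work["DOI"])                         # should echo the DOI above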

    References listed on IDEAS

    1. Keith Baggerly, 2010. "Disclose all data in publications," Nature, Nature, vol. 467(7314), pages 401-401, September.
    2. Andreas Lundh & Marija Barbateskovic & Asbjørn Hróbjartsson & Peter C Gøtzsche, 2010. "Conflicts of Interest at Medical Journals: The Influence of Industry-Supported Randomised Trials on Journal Impact Factors and Revenue – Cohort Study," PLOS Medicine, Public Library of Science, vol. 7(10), pages 1-7, October.
    3. Neal S Young, 2008. "Why Current Publication Practices May Distort Science," Working Papers id:1757, eSocialSciences.
    4. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Zeng, Tong & Wu, Longfeng & Bratt, Sarah & Acuna, Daniel E., 2020. "Assigning credit to scientific datasets using article citation networks," Journal of Informetrics, Elsevier, vol. 14(2).
    2. Rut Lucas-Dominguez & Adolfo Alonso-Arroyo & Antonio Vidal-Infer & Rafael Aleixandre-Benavent, 2021. "The sharing of research data facing the COVID-19 pandemic," Scientometrics, Springer; Akadémiai Kiadó, vol. 126(6), pages 4975-4990, June.
    3. Sixto-Costoya Andrea & Robinson-Garcia Nicolas & Leeuwen Thed & Costas Rodrigo, 2021. "Exploring the relevance of ORCID as a source of study of data sharing activities at the individual-level: a methodological discussion," Scientometrics, Springer; Akadémiai Kiadó, vol. 126(8), pages 7149-7165, August.
    4. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    5. Stephanie B Linek & Benedikt Fecher & Sascha Friesike & Marcel Hebing, 2017. "Data sharing as social dilemma: Influence of the researcher’s personality," PLOS ONE, Public Library of Science, vol. 12(8), pages 1-24, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Kirmayer, Laurence J., 2012. "Cultural competence and evidence-based practice in mental health: Epistemic communities and the politics of pluralism," Social Science & Medicine, Elsevier, vol. 75(2), pages 249-256.
    4. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.
    5. Matthew L. Wallace & Ismael Rafols, 2016. "Shaping the Agenda of a Grand Challenge: Institutional Mediation of Priorities in Avian Influenza Research," SPRU Working Paper Series 2016-02, SPRU - Science Policy Research Unit, University of Sussex Business School.
    6. Wallace, Matthew L. & Ràfols, Ismael, 2018. "Institutional shaping of research priorities: A case study on avian influenza," Research Policy, Elsevier, vol. 47(10), pages 1975-1989.
    7. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    8. Stephan B. Bruns, 2013. "Identifying Genuine Effects in Observational Research by Means of Meta-Regressions," Jena Economics Research Papers 2013-040, Friedrich-Schiller-University Jena.
    9. Judith G M Bergboer & Maša Umićević-Mirkov & Jaap Fransen & Martin den Heijer & Barbara Franke & Piet L C M van Riel & Joost Schalkwijk & Marieke J H Coenen & on behalf of the Nijmegen Biomedical Stud, 2012. "A Replication Study of the Association between Rheumatoid Arthritis and Deletion of the Late Cornified Envelope Genes LCE3B and LCE3C," PLOS ONE, Public Library of Science, vol. 7(2), pages 1-5, February.
    10. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    11. Brian P Walcott & Sameer A Sheth & Brian V Nahed & Jean-Valery Coumans, 2012. "Conflict of Interest in Spine Research Reporting," PLOS ONE, Public Library of Science, vol. 7(8), pages 1-4, August.
    12. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    13. Mangirdas Morkunas & Elzė Rudienė & Lukas Giriūnas & Laura Daučiūnienė, 2020. "Assessment of Factors Causing Bias in Marketing-Related Publications," Publications, MDPI, vol. 8(4), pages 1-16, October.
    14. Gregory S Barsh & Gregory P Copenhaver, 2009. "Scientists←Editors←Scientists: The Past, Present, and Future of PLoS Genetics," PLOS Genetics, Public Library of Science, vol. 5(7), pages 1-2, July.
    15. Boomsma, Mirthe, 2021. "On the transition to a sustainable economy : Field experimental evidence on behavioral interventions," Other publications TiSEM a0a27602-10ed-4ab1-87a5-5, Tilburg University, School of Economics and Management.
    16. Mitesh Kataria, 2010. "The Role of Preferences in Disagreements over Scientific Hypothesis: An Empirical Inquiry into Environmental and Economic Decision Making," Jena Economics Research Papers 2010-088, Friedrich-Schiller-University Jena.
    17. Nelson, Jon P., 2014. "Estimating the price elasticity of beer: Meta-analysis of data with heterogeneity, dependence, and publication bias," Journal of Health Economics, Elsevier, vol. 33(C), pages 180-187.
    18. T.D. Stanley, 2013. "Does economics add up? An introduction to meta-regression analysis," European Journal of Economics and Economic Policies: Intervention, Edward Elgar Publishing, vol. 10(2), pages 207-220.
    19. Christine R Harris & Noriko Coburn & Doug Rohrer & Harold Pashler, 2013. "Two Failures to Replicate High-Performance-Goal Priming Effects," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-1, August.
    20. Martin Paldam, 2016. "Simulating an empirical paper by the rational economist," Empirical Economics, Springer, vol. 50(4), pages 1383-1407, June.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.