Printed from https://ideas.repec.org/p/osf/thesis/n5y8b_v1.html

A new method to explore inferential risks associated with each study in a meta-analysis: An approach based on Design Analysis

Author

Listed:
  • Giorgi, Francesca

Abstract

Over the past decade, scientific research has experienced an unprecedented credibility crisis: when researchers conducted replication studies, they frequently could not reproduce the original results. Replicated effect sizes were often weaker than in the original studies, and sometimes no effect was found at all. An important side effect of the replicability crisis, however, is that it raised awareness of problematic issues in the published literature and promoted the development of new practices intended to guarantee rigour, transparency and reproducibility. The aim of the current work is to propose a new method to explore the inferential risks associated with each study in a meta-analysis. Specifically, the method is based on Design Analysis, a power-analysis approach developed by Gelman and Carlin (2014), which makes it possible to assess two types of error that are not commonly considered: the Type M (Magnitude) error and the Type S (Sign) error, concerning the magnitude and the direction of the estimated effects, respectively. We chose the Design Analysis approach because it places more emphasis on the effect-size estimate and can offer researchers a valid tool for making more conscious and informed decisions.
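For a given hypothesized true effect and standard error, the Type S and Type M errors the abstract describes can be estimated by simulation, in the spirit of Gelman and Carlin's retrodesign calculations. A minimal sketch (the function name and input values are illustrative, and it assumes a normal sampling distribution with a two-sided test):

```python
import numpy as np
from statistics import NormalDist

def retrodesign(true_effect, se, alpha=0.05, n_sims=100_000, seed=1):
    """Simulate effect estimates ~ Normal(true_effect, se) and estimate
    power, the Type S error (wrong sign among significant results) and
    the Type M error (expected exaggeration of significant estimates)."""
    rng = np.random.default_rng(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. ~1.96 for alpha = 0.05
    est = rng.normal(true_effect, se, n_sims)
    signif = np.abs(est) > z_crit * se             # draws reaching significance
    power = signif.mean()
    sig_est = est[signif]
    type_s = np.mean(np.sign(sig_est) != np.sign(true_effect))
    type_m = np.mean(np.abs(sig_est)) / abs(true_effect)   # exaggeration ratio
    return power, type_s, type_m

# Underpowered example: a small true effect relative to its standard error
power, type_s, type_m = retrodesign(true_effect=0.1, se=0.5)
```

With low power, such a simulation reproduces the qualitative pattern Gelman and Carlin report: statistically significant estimates have a non-trivial chance of carrying the wrong sign and, on average, greatly exaggerate the true magnitude.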

Suggested Citation

  • Giorgi, Francesca, 2021. "A new method to explore inferential risks associated with each study in a meta-analysis: An approach based on Design Analysis," Thesis Commons n5y8b_v1, Center for Open Science.
  • Handle: RePEc:osf:thesis:n5y8b_v1
    DOI: 10.31219/osf.io/n5y8b_v1

    Download full text from publisher

    File URL: https://osf.io/download/61c9d1c42962ce0442b010f8/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/n5y8b_v1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Coosje L S Veldkamp & Michèle B Nuijten & Linda Dominguez-Alvarez & Marcel A L M van Assen & Jelte M Wicherts, 2014. "Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-19, December.
    2. Michal Krawczyk & Ernesto Reuben, 2012. "(Un)Available upon Request: Field Experiment on Researchers' Willingness to Share Supplementary Materials," Natural Field Experiments 00689, The Field Experiments Website.
    3. Pfenninger, Stefan & DeCarolis, Joseph & Hirth, Lion & Quoilin, Sylvain & Staffell, Iain, 2017. "The importance of open data and software: Is energy research lagging behind?," Energy Policy, Elsevier, vol. 101(C), pages 211-215.
    4. Benjamin D K Wood & Rui Müller & Annette N Brown, 2018. "Push button replication: Is impact evaluation evidence for international development verifiable?," PLOS ONE, Public Library of Science, vol. 13(12), pages 1-15, December.
    5. Dominique G Roche & Loeske E B Kruuk & Robert Lanfear & Sandra A Binning, 2015. "Public Data Archiving in Ecology and Evolution: How Well Are We Doing?," PLOS Biology, Public Library of Science, vol. 13(11), pages 1-12, November.
    6. Giorgi, Francesca, 2021. "A new method to explore inferential risks associated with each study in a meta-analysis: An approach based on Design Analysis," Thesis Commons n5y8b, Center for Open Science.
    7. Yulin Yu & Daniel M. Romero, 2024. "Does the Use of Unusual Combinations of Datasets Contribute to Greater Scientific Impact?," Papers 2402.05024, arXiv.org, revised Sep 2024.
    8. Matteo Colombo & Georgi Duev & Michèle B Nuijten & Jan Sprenger, 2018. "Statistical reporting inconsistencies in experimental philosophy," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-12, April.
    9. Esther Maassen & Marcel A L M van Assen & Michèle B Nuijten & Anton Olsson-Collentine & Jelte M Wicherts, 2020. "Reproducibility of individual effect sizes in meta-analyses in psychology," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-18, May.
    10. Irwin Waldman & Scott Lilienfeld, 2016. "Thinking About Data, Research Methods, and Statistical Analyses: Commentary on Sijtsma’s (2014) “Playing with Data”," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 16-26, March.
    11. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde Elisabeth Maria & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d_v1, Center for Open Science.
    12. Ryan P Womack, 2015. "Research Data in Core Journals in Biology, Chemistry, Mathematics, and Physics," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-22, December.
    13. Keiko Kurata & Mamiko Matsubayashi & Shinji Mine, 2017. "Identifying the Complex Position of Research Data and Data Sharing Among Researchers in Natural Science," SAGE Open, , vol. 7(3), pages 21582440177, July.
    14. Kraft-Todd, Gordon T. & Rand, David G., 2021. "Practice what you preach: Credibility-enhancing displays and the growth of open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 164(C), pages 1-10.
    15. Klaas Sijtsma, 2016. "Playing with Data—Or How to Discourage Questionable Research Practices and Stimulate Researchers to Do Things Right," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 1-15, March.
    16. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    17. Bennett Kleinberg & Bruno Verschuere, 2015. "Memory Detection 2.0: The First Web-Based Memory Detection Test," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-17, April.
    18. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    19. Antonia Krefeld-Schwalb & Benjamin Scheibehenne, 2023. "Tighter nets for smaller fishes? Mapping the development of statistical practices in consumer research between 2008 and 2020," Marketing Letters, Springer, vol. 34(3), pages 351-365, September.
    20. Peter Pütz & Stephan B. Bruns, 2021. "The (Non‐)Significance Of Reporting Errors In Economics: Evidence From Three Top Journals," Journal of Economic Surveys, Wiley Blackwell, vol. 35(1), pages 348-373, February.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:thesis:n5y8b_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://thesiscommons.org .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.