
Using Selection Models to Assess Sensitivity to Publication Bias: A Tutorial and Call for More Routine Use

Author

Listed:
  • Maier, Maximilian
  • VanderWeele, Tyler
  • Mathur, Maya B

Abstract

In meta-analyses, it is critical to assess the extent to which publication bias might have compromised the results. Classical methods based on the funnel plot, including Egger’s test and Trim-and-Fill, have become the de facto default methods to do so, with a large majority of recent meta-analyses in top medical journals (85%) assessing publication bias exclusively with these methods. However, these classical funnel plot methods have important limitations when used as the sole means of assessing publication bias: they essentially assume that the publication process favors large point estimates for small studies and does not affect the largest studies, and they can perform poorly when effects are heterogeneous. In light of these limitations, we recommend that meta-analyses routinely apply other publication bias methods in addition to, or instead of, classical funnel plot methods. To this end, we describe how to use and interpret selection models. These methods make the often more realistic assumption that publication bias favors "statistically significant" results, and they also directly accommodate effect heterogeneity. Selection models are well established in the statistics literature and are supported by user-friendly software, yet remain rarely reported in many disciplines. We use previously published meta-analyses to demonstrate that selection models can yield insights that extend beyond those provided by funnel plot methods, suggesting the importance of establishing more comprehensive reporting practices for publication bias assessment.
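
To make the selection-model idea in the abstract concrete, the sketch below fits a simple one-cutpoint ("step-function") selection model by maximum likelihood: statistically significant positive estimates are assumed to be published with probability 1, while all other results are published with a relative probability eta estimated from the data. This is an illustrative Python sketch on made-up estimates and standard errors, not the authors' code or software; applied analyses would typically use the user-friendly R tools the paper points to, such as the metafor package cited in the references.

```python
# Illustrative sketch (not the authors' implementation): maximum-likelihood fit
# of a one-cutpoint "step-function" selection model for meta-analysis.
# Assumptions: random-effects normal model for the true effects; studies with a
# significant positive estimate (two-sided p < 0.05) are published with
# probability 1, all other studies with relative probability eta in (0, 1).
# All data below are made up for illustration.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical published point estimates (e.g., standardized mean differences)
# and their standard errors.
yi = np.array([0.41, 0.12, 0.55, -0.05, 0.30, 0.62, 0.10, 0.48, 0.25, 0.70])
si = np.array([0.18, 0.15, 0.22,  0.20, 0.14, 0.28, 0.12, 0.20, 0.12, 0.30])
ZCRIT = norm.ppf(0.975)  # ~1.96, two-sided 0.05 cutoff

def neg_log_lik(params, yi, si):
    """Negative log-likelihood of the observed (published) estimates."""
    mu, log_tau, logit_eta = params
    tau2 = np.exp(log_tau) ** 2
    eta = 1.0 / (1.0 + np.exp(-logit_eta))   # keep eta in (0, 1)
    sd = np.sqrt(tau2 + si ** 2)

    # Marginal density before selection: yi ~ N(mu, tau^2 + si^2).
    log_dens = norm.logpdf(yi, loc=mu, scale=sd)

    # Selection weight: 1 if significant and positive, eta otherwise.
    signif = yi / si > ZCRIT
    log_w = np.where(signif, 0.0, np.log(eta))

    # Normalizing constant per study: P(published) under the marginal,
    # A_i = P(y > z * si) + eta * P(y <= z * si).
    p_nonsig = norm.cdf(ZCRIT * si, loc=mu, scale=sd)
    log_A = np.log((1 - p_nonsig) + eta * p_nonsig)

    return -np.sum(log_w + log_dens - log_A)

# Fit by maximum likelihood; starting values are arbitrary but reasonable.
fit = minimize(neg_log_lik, x0=np.array([0.3, np.log(0.1), 0.0]),
               args=(yi, si), method="Nelder-Mead")
mu_hat, tau_hat = fit.x[0], np.exp(fit.x[1])
eta_hat = 1.0 / (1.0 + np.exp(-fit.x[2]))
print(f"bias-corrected mean effect: {mu_hat:.3f}")
print(f"between-study SD (tau):     {tau_hat:.3f}")
print(f"relative publication probability of nonsignificant results: {eta_hat:.3f}")
```

A natural extension, in the spirit of the sensitivity analysis of Mathur and VanderWeele (2020) cited below, is to refit the model with eta fixed at increasingly severe values and ask how strong selection on significance would have to be to explain away the pooled effect.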

Suggested Citation

  • Maier, Maximilian & VanderWeele, Tyler & Mathur, Maya B, 2021. "Using Selection Models to Assess Sensitivity to Publication Bias: A Tutorial and Call for More Routine Use," MetaArXiv tp45u, Center for Open Science.
  • Handle: RePEc:osf:metaar:tp45u
    DOI: 10.31219/osf.io/tp45u

    Download full text from publisher

    File URL: https://osf.io/download/612771db38595800692ed8c9/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/tp45u?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Blakeley B. McShane & David Gal, 2017. "Rejoinder: Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 904-908, July.
    2. Maya B. Mathur & Tyler J. VanderWeele, 2020. "Sensitivity analysis for publication bias in meta‐analyses," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(5), pages 1091-1119, November.
    3. Viechtbauer, Wolfgang, 2010. "Conducting Meta-Analyses in R with the metafor Package," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 36(i03).
    4. Larose, Daniel T. & Dey, Dipak K., 1998. "Modeling publication bias using weighted distributions in a Bayesian framework," Computational Statistics & Data Analysis, Elsevier, vol. 26(3), pages 279-302, January.
    5. Blakeley B. McShane & David Gal, 2017. "Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 885-895, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Maximilian Maier & Tyler J. VanderWeele & Maya B. Mathur, 2022. "Using selection models to assess sensitivity to publication bias: A tutorial and call for more routine use," Campbell Systematic Reviews, John Wiley & Sons, vol. 18(3), September.
    2. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    3. Bertoldi, Paolo & Mosconi, Rocco, 2020. "Do energy efficiency policies save energy? A new approach based on energy policy indicators (in the EU Member States)," Energy Policy, Elsevier, vol. 139(C).
    4. Anderson, Brian S. & Wennberg, Karl & McMullen, Jeffery S., 2019. "Editorial: Enhancing quantitative theory-testing entrepreneurship research," Journal of Business Venturing, Elsevier, vol. 34(5), pages 1-1.
    5. Wennberg, Karl & Anderson, Brian S. & McMullen, Jeffrey, 2019. "Editorial: Enhancing Quantitative Theory-Testing Entrepreneurship Research," Ratio Working Papers 323, The Ratio Institute.
    6. Maya B. Mathur & Tyler J. VanderWeele, 2020. "Sensitivity analysis for publication bias in meta‐analyses," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(5), pages 1091-1119, November.
    7. David J. Hand, 2022. "Trustworthiness of statistical inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 185(1), pages 329-347, January.
    8. Anderson, Brian S., 2022. "What executives get wrong about statistics: Moving from statistical significance to effect sizes and practical impact," Business Horizons, Elsevier, vol. 65(3), pages 379-388.
    9. van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2018. "P-uniform," MetaArXiv zqjr9, Center for Open Science.
    10. J. M. Bauer & L. A. Reisch, 2019. "Behavioural Insights and (Un)healthy Dietary Choices: a Review of Current Evidence," Journal of Consumer Policy, Springer, vol. 42(1), pages 3-45, March.
    11. Jeffrey A. Mills & Gary Cornwall & Beau A. Sauley & Jeffrey R. Strawn, 2018. "Improving the Analysis of Randomized Controlled Trials: a Posterior Simulation Approach," BEA Working Papers 0157, Bureau of Economic Analysis.
    12. Han Wang & Sieglinde S Snapp & Monica Fisher & Frederi Viens, 2019. "A Bayesian analysis of longitudinal farm surveys in Central Malawi reveals yield determinants and site-specific management strategies," PLOS ONE, Public Library of Science, vol. 14(8), pages 1-17, August.
    13. Tom Engsted, 2024. "What Is the False Discovery Rate in Empirical Research?," Econ Journal Watch, Econ Journal Watch, vol. 21(1), pages 92-112, March.
    14. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    15. Luigi Pace & Alessandra Salvan, 2020. "Likelihood, Replicability and Robbins' Confidence Sequences," International Statistical Review, International Statistical Institute, vol. 88(3), pages 599-615, December.
    16. Glenn Shafer, 2021. "Testing by betting: A strategy for statistical and scientific communication," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 407-431, April.
    17. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia, 2019. "Twenty Steps Towards an Adequate Inferential Interpretation of p-Values in Econometrics," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 239(4), pages 703-721, August.
    18. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
    19. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    20. Whitney S Beck & Ed K Hall, 2018. "Confounding factors in algal phosphorus limitation experiments," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-19, October.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:tp45u. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.