
The Effect of Publication Bias on the Assessment of Heterogeneity

Author

Listed:
  • Augusteijn, Hilde
  • van Aert, Robbie Cornelis Maria
  • van Assen, Marcel A. L. M.

Abstract

One of the main goals of meta-analysis is to test and estimate the heterogeneity of effect sizes. We examined the effect of publication bias on the Q-test and assessments of heterogeneity as a function of true heterogeneity, publication bias, true effect size, number of studies, and variation of sample sizes. The expected values of the heterogeneity measures H² and I² were derived analytically, and the power and type I error rate of the Q-test were examined in a Monte Carlo simulation study. Our results show that the effect of publication bias on the Q-test and the assessment of heterogeneity is large, complex, and non-linear. Publication bias can both dramatically decrease and dramatically increase heterogeneity. Extreme homogeneity can occur even when population heterogeneity is large. Particularly when the number of studies is large and the population effect size is small, publication bias can push both the type I error rate and the power of the Q-test to extremes close to 0 or 1. We therefore conclude that the Q-test of homogeneity and the heterogeneity measures H² and I² are generally not valid for assessing and testing heterogeneity when publication bias is present, especially when the true effect size is small and the number of studies is large. We introduce a web application, Q-sense, which can be used to assess the sensitivity of the Q-test to publication bias, and we apply it to two published meta-analyses. Meta-analytic methods should be enhanced so that they can account for publication bias when assessing and testing heterogeneity.
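
A hedged illustration of the statistics involved: Cochran's Q is the weighted sum of squared deviations of the observed effects from their fixed-effect weighted mean, H² = Q/(k − 1), and I² = max(0, (Q − df)/Q) with df = k − 1. The Python sketch below is not the authors' simulation code; the selection rule (keep only one-sided significant studies), the effect size, heterogeneity, and sample size are illustrative assumptions. It draws standardized mean differences with and without this crude publication-bias filter and computes Q, H², and I² from the retained studies.

    # Minimal sketch (not the authors' code): how a significance-only selection
    # rule, used here as a crude stand-in for publication bias, can distort
    # Cochran's Q and the derived H^2 / I^2 measures. Parameter values are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_meta(k=40, delta=0.1, tau2=0.05, n=30, pub_bias=True):
        """Generate k two-group studies of a standardized mean difference; if
        pub_bias, keep only studies with a one-sided p-value below .05."""
        studies = []
        while len(studies) < k:
            theta_i = rng.normal(delta, np.sqrt(tau2))    # study-specific true effect
            var_i = 2 / n + theta_i**2 / (4 * n)          # approx. sampling variance of d
            d_i = rng.normal(theta_i, np.sqrt(var_i))     # observed effect size
            if (not pub_bias) or d_i / np.sqrt(var_i) > 1.645:
                studies.append((d_i, var_i))
        return np.array(studies)

    def q_h2_i2(studies):
        """Cochran's Q, H^2 = Q/(k-1), and I^2 = max(0, (Q - df)/Q)."""
        y, v = studies[:, 0], studies[:, 1]
        w = 1 / v
        ybar = np.sum(w * y) / np.sum(w)                  # fixed-effect weighted mean
        Q = np.sum(w * (y - ybar) ** 2)
        df = len(y) - 1
        return Q, Q / df, max(0.0, (Q - df) / Q)

    for bias in (False, True):
        Q, H2, I2 = q_h2_i2(simulate_meta(pub_bias=bias))
        print(f"publication bias={bias}: Q={Q:.1f}, H2={H2:.2f}, I2={I2:.2f}")

Under these particular settings the significance filter truncates the distribution of observed effects, so the biased run typically yields Q, H², and I² far below their unbiased counterparts, echoing the paper's point that extreme homogeneity can be observed even when population heterogeneity is substantial; under other configurations the distortion can go in the opposite direction.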

Suggested Citation

  • Augusteijn, Hilde & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2017. "The Effect of Publication Bias on the Assessment of Heterogeneity," OSF Preprints gv25c, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:gv25c
    DOI: 10.31219/osf.io/gv25c

    Download full text from publisher

    File URL: https://osf.io/download/594251f96c613b022aa0d041/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/gv25c?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Michal Kicinski, 2013. "Publication Bias in Recent Meta-Analyses," PLOS ONE, Public Library of Science, vol. 8(11), pages 1-1, November.
    2. André L A Rabelo & Victor N Keller & Ronaldo Pilati & Jelte M Wicherts, 2015. "No Effect of Weight on Judgments of Importance in the Moral Domain and Evidence of Publication Bias from a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-15, August.
    3. Kristian Thorlund & Georgina Imberger & Bradley C Johnston & Michael Walsh & Tahany Awad & Lehana Thabane & Christian Gluud & P J Devereaux & Jørn Wetterslev, 2012. "Evolution of Heterogeneity (I2) Estimates and Their 95% Confidence Intervals in Large Meta-Analyses," PLOS ONE, Public Library of Science, vol. 7(7), pages 1-8, July.
    4. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    5. Wolfgang Viechtbauer, 2007. "Approximate Confidence Intervals for Standardized Effect Sizes in the Two-Independent and Two-Dependent Samples Design," Journal of Educational and Behavioral Statistics, , vol. 32(1), pages 39-60, March.
    6. Jaime L. Peters & Alex J. Sutton & David R. Jones & Keith R. Abrams & Lesley Rushton & Santiago G. Moreno, 2010. "Assessing publication bias in meta‐analyses in the presence of between‐study heterogeneity," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 173(3), pages 575-591, July.
    7. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    8. Dan Jackson, 2007. "Assessing the Implications of Publication Bias for Two Popular Estimates of between-Study Variance in Meta-Analysis," Biometrics, The International Biometric Society, vol. 63(1), pages 187-193, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Ivan Ropovik & Matus Adamkovic & David Greger, 2021. "Neglect of publication bias compromises meta-analyses of educational research," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-14, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Augusteijn, Hilde Elisabeth Maria & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2021. "Posterior Probabilities of Effect Sizes and Heterogeneity in Meta-Analysis: An Intuitive Approach of Dealing with Publication Bias," OSF Preprints avkgj, Center for Open Science.
    2. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    3. Dan Jackson, 2018. "Discussion on Quantifying publication bias in meta‐analysis," Biometrics, The International Biometric Society, vol. 74(3), pages 795-796, September.
    4. van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2018. "P-uniform," MetaArXiv zqjr9, Center for Open Science.
    5. Christian Harlos & Tim C. Edgell & Johan Hollander, 2017. "No evidence of publication bias in climate change science," Climatic Change, Springer, vol. 140(3), pages 375-385, February.
    6. Robbie C M van Aert & Jelte M Wicherts & Marcel A L M van Assen, 2019. "Publication bias examined in meta-analyses from psychology and medicine: A meta-meta-analysis," PLOS ONE, Public Library of Science, vol. 14(4), pages 1-32, April.
    7. Piotr Bialowolski & Dorota Weziak-Bialowolska & Eileen McNeely, 2021. "The Role of Financial Fragility and Financial Control for Well-Being," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 155(3), pages 1137-1157, June.
    8. Tian, Dan & Hu, Xiao & Qian, Yuchen & Li, Jiang, 2024. "Exploring the scientific impact of negative results," Journal of Informetrics, Elsevier, vol. 18(1).
    9. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    10. Wenzhi Wang & Yumin Hu & Peiou Lu & Yingci Li & Yunfu Chen & Mohan Tian & Lijuan Yu, 2014. "Evaluation of the Diagnostic Performance of Magnetic Resonance Spectroscopy in Brain Tumors: A Systematic Review and Meta-Analysis," PLOS ONE, Public Library of Science, vol. 9(11), pages 1-11, November.
    11. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    12. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    13. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    14. Lifeng Lin & Haitao Chu, 2018. "Rejoinder to “quantifying publication bias in meta‐analysis”," Biometrics, The International Biometric Society, vol. 74(3), pages 801-802, September.
    15. Johannes Hönekopp & Audrey Helen Linden, 2022. "Heterogeneity estimates in a biased world," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-21, February.
    16. Rosa Lavelle-Hill & Gavin Smith & Anjali Mazumder & Todd Landman & James Goulding, 2021. "Machine learning methods for “wicked” problems: exploring the complex drivers of modern slavery," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-11, December.
    17. David Winkelmann & Marius Ötting & Christian Deutscher & Tomasz Makarewicz, 2024. "Are Betting Markets Inefficient? Evidence From Simulations and Real Data," Journal of Sports Economics, , vol. 25(1), pages 54-97, January.
    18. Vitor Azevedo & Christopher Hoegner, 2023. "Enhancing stock market anomalies with machine learning," Review of Quantitative Finance and Accounting, Springer, vol. 60(1), pages 195-230, January.
    19. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research In Economics.
    20. Rinne, Sonja, 2024. "Estimating the merit-order effect using coarsened exact matching: Reconciling theory with the empirical results to improve policy implications," Energy Policy, Elsevier, vol. 185(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:gv25c. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.