
The Effect of Publication Bias on the Assessment of Heterogeneity

Authors

  • Augusteijn, Hilde
  • van Aert, Robbie Cornelis Maria
  • van Assen, Marcel A. L. M.

Abstract

One of the main goals of meta-analysis is to test and estimate the heterogeneity of effect sizes. We examined the effect of publication bias on the Q-test and on assessments of heterogeneity, as a function of true heterogeneity, publication bias, true effect size, number of studies, and variation in sample sizes. The expected values of the heterogeneity measures H² and I² were derived analytically, and the power and type I error rate of the Q-test were examined in a Monte Carlo simulation study. Our results show that the effect of publication bias on the Q-test and on the assessment of heterogeneity is large, complex, and non-linear. Publication bias can dramatically decrease as well as increase observed heterogeneity; extreme homogeneity can occur even when population heterogeneity is large. Particularly when the number of studies is large and the population effect size is small, publication bias can push both the type I error rate and the power of the Q-test to extremes close to 0 or 1. We therefore conclude that the Q-test of homogeneity and the heterogeneity measures H² and I² are generally not valid for assessing and testing heterogeneity when publication bias is present, especially when the true effect size is small and the number of studies is large. We introduce a web application, Q-sense, which can be used to assess the sensitivity of the Q-test to publication bias, and we apply it to two published meta-analyses. Meta-analytic methods should be enhanced to account for publication bias when assessing and testing heterogeneity.
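To make the quantities discussed above concrete: Cochran's Q is the weighted sum of squared deviations of the observed study effects from their fixed-effect average, Q = Σ wᵢ(yᵢ − μ̂)² with wᵢ = 1/vᵢ, and the heterogeneity measures are H² = Q/(k − 1) and I² = (Q − (k − 1))/Q, truncated at zero. The sketch below is purely illustrative and is not the authors' simulation code or the Q-sense application; it assumes standardized mean differences with an approximate sampling variance of 2/n per study and models publication bias as publishing only studies with a significant one-sided result (p < .05). All function names and parameter values are hypothetical.

    # Illustrative sketch (assumptions noted above), not the authors' code:
    # how selecting only significant studies can distort Cochran's Q and the
    # H^2 / I^2 heterogeneity measures.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def simulate_meta(k, mu, tau2, n_per_group, pub_bias):
        """Draw k published studies; with pub_bias=True keep only significant ones."""
        effects, variances = [], []
        while len(effects) < k:
            theta_i = rng.normal(mu, np.sqrt(tau2))   # true effect of study i
            se_i = np.sqrt(2.0 / n_per_group)         # rough SE of a standardized mean difference
            d_i = rng.normal(theta_i, se_i)           # observed effect
            significant = d_i / se_i > stats.norm.ppf(0.95)   # one-sided p < .05
            if (not pub_bias) or significant:
                effects.append(d_i)
                variances.append(se_i ** 2)
        return np.array(effects), np.array(variances)

    def q_test(effects, variances):
        """Cochran's Q with H^2 = Q/(k-1) and I^2 = max(0, (Q - (k-1))/Q)."""
        w = 1.0 / variances
        mu_hat = np.sum(w * effects) / np.sum(w)      # fixed-effect estimate
        Q = np.sum(w * (effects - mu_hat) ** 2)
        df = len(effects) - 1
        H2 = Q / df
        I2 = max(0.0, (Q - df) / Q)
        p_value = stats.chi2.sf(Q, df)
        return Q, H2, I2, p_value

    # Rejection rate of the Q-test when the population is homogeneous (tau2 = 0):
    # nominally about .05, but selecting significant studies can move it far from that.
    for bias in (False, True):
        rejected = [q_test(*simulate_meta(k=40, mu=0.1, tau2=0.0,
                                          n_per_group=25, pub_bias=bias))[3] < .05
                    for _ in range(500)]
        print(f"publication bias = {bias}: Q-test rejection rate = {np.mean(rejected):.3f}")

Under these assumptions the biased run conditions every published study on significance, so the observed effects are truncated from below; this is one simple way to make the selection mechanism explicit, not a statement about how the paper's analytic derivations or simulations were actually set up.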

Suggested Citation

  • Augusteijn, Hilde & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2017. "The Effect of Publication Bias on the Assessment of Heterogeneity," OSF Preprints gv25c, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:gv25c
    DOI: 10.31219/osf.io/gv25c

    Download full text from publisher

    File URL: https://osf.io/download/594251f96c613b022aa0d041/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/gv25c?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kristian Thorlund & Georgina Imberger & Bradley C Johnston & Michael Walsh & Tahany Awad & Lehana Thabane & Christian Gluud & P J Devereaux & Jørn Wetterslev, 2012. "Evolution of Heterogeneity (I2) Estimates and Their 95% Confidence Intervals in Large Meta-Analyses," PLOS ONE, Public Library of Science, vol. 7(7), pages 1-8, July.
    2. Jaime L. Peters & Alex J. Sutton & David R. Jones & Keith R. Abrams & Lesley Rushton & Santiago G. Moreno, 2010. "Assessing publication bias in meta‐analyses in the presence of between‐study heterogeneity," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 173(3), pages 575-591, July.
    3. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    4. Michal Kicinski, 2013. "Publication Bias in Recent Meta-Analyses," PLOS ONE, Public Library of Science, vol. 8(11), pages 1-1, November.
    5. André L A Rabelo & Victor N Keller & Ronaldo Pilati & Jelte M Wicherts, 2015. "No Effect of Weight on Judgments of Importance in the Moral Domain and Evidence of Publication Bias from a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-15, August.
    6. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer; Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    7. Wolfgang Viechtbauer, 2007. "Approximate Confidence Intervals for Standardized Effect Sizes in the Two-Independent and Two-Dependent Samples Design," Journal of Educational and Behavioral Statistics, vol. 32(1), pages 39-60, March.
    8. Dan Jackson, 2007. "Assessing the Implications of Publication Bias for Two Popular Estimates of between-Study Variance in Meta-Analysis," Biometrics, The International Biometric Society, vol. 63(1), pages 187-193, March.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Ivan Ropovik & Matus Adamkovic & David Greger, 2021. "Neglect of publication bias compromises meta-analyses of educational research," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-14, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Augusteijn, Hilde Elisabeth Maria & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2021. "Posterior Probabilities of Effect Sizes and Heterogeneity in Meta-Analysis: An Intuitive Approach of Dealing with Publication Bias," OSF Preprints avkgj, Center for Open Science.
    2. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    3. Dan Jackson, 2018. "Discussion on Quantifying publication bias in meta‐analysis," Biometrics, The International Biometric Society, vol. 74(3), pages 795-796, September.
    4. van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2018. "P-uniform," MetaArXiv zqjr9, Center for Open Science.
    5. Christian Harlos & Tim C. Edgell & Johan Hollander, 2017. "No evidence of publication bias in climate change science," Climatic Change, Springer, vol. 140(3), pages 375-385, February.
    6. Robbie C M van Aert & Jelte M Wicherts & Marcel A L M van Assen, 2019. "Publication bias examined in meta-analyses from psychology and medicine: A meta-meta-analysis," PLOS ONE, Public Library of Science, vol. 14(4), pages 1-32, April.
    7. Piotr Bialowolski & Dorota Weziak-Bialowolska & Eileen McNeely, 2021. "The Role of Financial Fragility and Financial Control for Well-Being," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 155(3), pages 1137-1157, June.
    8. Tian, Dan & Hu, Xiao & Qian, Yuchen & Li, Jiang, 2024. "Exploring the scientific impact of negative results," Journal of Informetrics, Elsevier, vol. 18(1).
    9. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, vol. 47(2), pages 304-343, March.
    10. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    11. David Winkelmann & Marius Ötting & Christian Deutscher & Tomasz Makarewicz, 2024. "Are Betting Markets Inefficient? Evidence From Simulations and Real Data," Journal of Sports Economics, vol. 25(1), pages 54-97, January.
    12. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    13. Christian Heise & Joshua M. Pearce, 2020. "From Open Access to Open Science: The Path From Scientific Reality to Open Scientific Communication," SAGE Open, vol. 10(2), pages 21582440209, May.
    14. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    15. Polák, Petr, 2017. "The productivity paradox: A meta-analysis," Information Economics and Policy, Elsevier, vol. 38(C), pages 38-54.
    16. Mei Tian & Yan Su & Xin Ru, 2016. "Perish or Publish in China: Pressures on Young Chinese Scholars to Publish in Internationally Indexed Journals," Publications, MDPI, vol. 4(2), pages 1-16, April.
    17. Camilo Germán Alberto Pérez Chaparro & Philipp Zech & Felipe Schuch & Bernd Wolfarth & Michael Rapp & Andreas Heißel, 2018. "Effects of aerobic and resistance exercise alone or combined on strength and hormone outcomes for people living with HIV. A meta-analysis," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-21, September.
    18. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    19. Martin E Héroux & Janet L Taylor & Simon C Gandevia, 2015. "The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-10, December.
    20. Tracey L Weissgerber, 2021. "Learning from the past to develop data analysis curricula for the future," PLOS Biology, Public Library of Science, vol. 19(7), pages 1-3, July.
