Printed from https://ideas.repec.org/p/mar/magkse/201603.html

The Fragility of Meta-Regression Models in Observational Research

Author

  • Stephan B. Bruns (University of Kassel)

Abstract

Many meta-regression analyses that synthesize estimates from primary studies have now been published in economics. Meta-regression models attempt to infer the presence of genuine empirical effects even if the authors of primary studies select statistically significant and theory-confirming estimates for publication. These models were originally developed for the synthesis of experimental research, where randomization ensures unbiased and consistent estimation of the effect of interest. Most economics research is, however, observational, and authors of primary studies can search across different regression specifications for statistically significant and theory-confirming estimates. Each regression specification may suffer from biases, such as omitted-variable bias, that render estimation of the effect of interest biased and inconsistent. We show that if the authors of primary studies search for statistically significant and theory-confirming estimates, meta-regression models tend systematically to produce false-positive findings of genuine empirical effects. The ubiquity of such search processes may limit the applicability of meta-regression models for identifying genuine empirical effects in economics.
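The mechanism the abstract describes can be sketched in a small Monte Carlo simulation: if each primary study reports the most theory-confirming of several specifications, each carrying its own omitted-variable bias that, unlike sampling error, does not shrink with precision, then a FAT-PET-style meta-regression (in the spirit of the Stanley 2008 approach cited below) tends to "detect" a genuine effect even though the true effect is zero. This sketch is not from the paper; the selection rule, bias scale, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_study(se, n_specs=20, bias_sd=0.3):
    # True effect is zero. Each specification carries its own fixed-scale
    # omitted-variable bias (which does not shrink with sample size) plus
    # sampling noise of scale se. The author reports the specification with
    # the largest positive (theory-confirming) t-statistic.
    # bias_sd and n_specs are illustrative assumptions.
    biases = rng.normal(0.0, bias_sd, size=n_specs)
    noise = rng.normal(0.0, se, size=n_specs)
    candidates = biases + noise
    return candidates[np.argmax(candidates / se)]

# Meta-sample: 50 primary studies with varying precision.
ses = rng.uniform(0.1, 1.0, size=50)
estimates = np.array([simulate_study(se) for se in ses])

# FAT-PET meta-regression in its t-statistic form:
#   t_i = a + c * (1/se_i),
# where a significant slope c is conventionally read as evidence of a
# genuine empirical effect beyond publication selection.
t_obs = estimates / ses
X = np.column_stack([np.ones_like(ses), 1.0 / ses])
coef, *_ = np.linalg.lstsq(X, t_obs, rcond=None)
resid = t_obs - X @ coef
sigma2 = resid @ resid / (len(ses) - 2)
cov = sigma2 * np.linalg.inv(X.T @ X)
t_pet = coef[1] / np.sqrt(cov[1, 1])
print(f"PET slope: {coef[1]:.3f}, t-statistic: {t_pet:.2f}")
```

Because the selected estimate is dominated by the largest specification bias rather than by sampling error, it does not shrink as precision grows, so the PET slope comes out positive and statistically significant: a false-positive finding of a genuine effect, exactly the fragility the paper analyzes.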

Suggested Citation

  • Stephan B. Bruns, 2016. "The Fragility of Meta-Regression Models in Observational Research," MAGKS Papers on Economics 201603, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
  • Handle: RePEc:mar:magkse:201603

    Download full text from publisher

    File URL: http://www.uni-marburg.de/fb02/makro/forschung/magkspapers/paper_2016/03-2016_bruns.pdf
    File Function: First 201603
    Download Restriction: no

    References listed on IDEAS

    1. Adam, Antonis & Kammas, Pantelis & Lagou, Athina, 2013. "The effect of globalization on capital taxation: What have we learned after 20 years of empirical studies?," Journal of Macroeconomics, Elsevier, vol. 35(C), pages 199-209.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Frey, Bruno S, 2003. "Publishing as Prostitution?--Choosing between One's Own Ideas and Academic Success," Public Choice, Springer, vol. 116(1-2), pages 205-223, July.
    4. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    5. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    6. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    7. T. D. Stanley, 2008. "Meta‐Regression Methods for Detecting and Estimating Empirical Effects in the Presence of Publication Selection," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 70(1), pages 103-127, February.
    8. Mark Koetse & Raymond Florax & Henri Groot, 2010. "Consequences of effect size heterogeneity for meta-analysis: a Monte Carlo study," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 19(2), pages 217-236, June.
    9. Card, David & Krueger, Alan B, 1995. "Time-Series Minimum-Wage Studies: A Meta-analysis," American Economic Review, American Economic Association, vol. 85(2), pages 238-243, May.
    10. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    11. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    12. Efendic, Adnan & Pugh, Geoff & Adnett, Nick, 2011. "Institutions and economic performance: A meta-regression analysis," European Journal of Political Economy, Elsevier, vol. 27(3), pages 586-599, September.
    13. Edward L. Glaeser, 2006. "Researcher Incentives and Empirical Methods," NBER Technical Working Papers 0329, National Bureau of Economic Research, Inc.
    14. repec:bla:econom:v:47:y:1980:i:188:p:387-406 is not listed on IDEAS
    15. Sims, Christopher A, 1988. "Uncertainty across Models," American Economic Review, American Economic Association, vol. 78(2), pages 163-167, May.
    16. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Stephan B Bruns & John P A Ioannidis, 2016. "p-Curve and p-Hacking in Observational Research," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-13, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stephan B. Bruns, 2017. "Meta-Regression Models and Observational Research," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 79(5), pages 637-653, October.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Stephan B. Bruns, 2013. "Identifying Genuine Effects in Observational Research by Means of Meta-Regressions," Jena Economics Research Papers 2013-040, Friedrich-Schiller-University Jena.
    4. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    5. Obsa Urgessa Ayana & Jima Degaga, 2022. "Effects of rural electrification on household welfare: a meta-regression analysis," International Review of Economics, Springer;Happiness Economics and Interpersonal Relations (HEIRS), vol. 69(2), pages 209-261, June.
    6. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    7. Sebastian Gechert & Tomas Havranek & Zuzana Irsova & Dominika Kolcunova, 2022. "Measuring Capital-Labor Substitution: The Importance of Method Choices and Publication Bias," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 45, pages 55-82, July.
    8. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    9. Hippolyte W. Balima & Eric G. Kilama & Rene Tapsoba, 2017. "Settling the Inflation Targeting Debate: Lights from a Meta-Regression Analysis," IMF Working Papers 2017/213, International Monetary Fund.
    10. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    11. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    12. Paldam, Martin, 2018. "A model of the representative economist, as researcher and policy advisor," European Journal of Political Economy, Elsevier, vol. 54(C), pages 5-15.
    13. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    14. Tomas Havranek & Anna Sokolova, 2020. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 144 Studies Say 'Probably Not'," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 35, pages 97-122, January.
    15. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    16. Paldam, Martin, 2015. "Meta-analysis in a nutshell: Techniques and general findings," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 9, pages 1-14.
    17. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    18. Dimos, Christos & Pugh, Geoff & Hisarciklilar, Mehtap & Talam, Ema & Jackson, Ian, 2022. "The relative effectiveness of R&D tax credits and R&D subsidies: A comparative meta-regression analysis," Technovation, Elsevier, vol. 115(C).
    19. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    20. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.

    More about this item

    Keywords

    Meta-regression; meta-analysis; p-hacking; publication bias; omitted-variable bias; sampling variability; sampling error; Monte Carlo simulation.

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C15 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Statistical Simulation Methods: General
    • C40 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:mar:magkse:201603. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Bernd Hayo. General contact details of provider: https://edirc.repec.org/data/vamarde.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.