Printed from https://ideas.repec.org/a/ejw/journl/v21y2024i1p92-112.html

What Is the False Discovery Rate in Empirical Research?

Author

Listed:
  • Tom Engsted

Abstract

A scientific discovery in empirical research, e.g., establishing a causal relationship between two variables, is typically based on rejecting a statistical null hypothesis of no relationship. What is the probability that such a rejection is a mistake? This probability is not controlled by the significance level of the test which is typically set at 5 percent. Statistically, the ‘False Discovery Rate’ (FDR) is the fraction of null rejections that are false. FDR depends on the significance level, the power of the test, and the prior probability that the null is true. All else equal, the higher the prior probability, the higher is the FDR. Economists have different views on how to assess this prior probability. I argue that for both statistical and economic reasons, the prior probability of the null should in general be quite high and, thus, the FDR in empirical economics is high, i.e., substantially higher than 5 percent. This may be a contributing factor behind the replication crisis that also haunts economics. Finally, I discuss conventional and newly proposed stricter significance thresholds and, more generally, the problems that passively observed economic and social science data pose for the traditional statistical testing paradigm.
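
As an illustration of the arithmetic behind the abstract: under the standard formulation, the FDR equals α·π0 / (α·π0 + (1 − β)·(1 − π0)), where α is the significance level, 1 − β the power of the test, and π0 the prior probability that the null is true. The short Python sketch below is not taken from the article; the function name and the example values for power and the prior are illustrative assumptions.

    # Minimal sketch of the textbook FDR calculation (illustrative, not from the article).
    def false_discovery_rate(alpha, power, prior_null):
        """Expected share of null rejections that are false discoveries."""
        false_rejections = alpha * prior_null          # true nulls wrongly rejected
        true_rejections = power * (1.0 - prior_null)   # false nulls correctly rejected
        return false_rejections / (false_rejections + true_rejections)

    # With a 5 percent significance level and 80 percent power, a higher prior
    # probability that the null is true pushes the FDR well above 5 percent.
    for prior in (0.5, 0.8, 0.9):
        print(f"prior_null={prior:.1f}  FDR={false_discovery_rate(0.05, 0.80, prior):.2f}")

For example, with α = 0.05 and power 0.80, this gives an FDR of roughly 6 percent when π0 = 0.5, 20 percent when π0 = 0.8, and about 36 percent when π0 = 0.9, which is the sense in which the FDR can be substantially higher than the significance level.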

Suggested Citation

  • Tom Engsted, 2024. "What Is the False Discovery Rate in Empirical Research?," Econ Journal Watch, Econ Journal Watch, vol. 21(1), pages 92-112, March.
  • Handle: RePEc:ejw:journl:v:21:y:2024:i:1:p:92-112

    Download full text from publisher

    File URL: https://econjwatch.org/File+download/1299/EngstedMar2024.pdf?mimetype=pdf
    Download Restriction: no

    File URL: https://econjwatch.org/1346
    Download Restriction: no

    References listed on IDEAS

    1. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    2. Startz, Richard, 2014. "Choosing the More Likely Hypothesis," Foundations and Trends(R) in Econometrics, now publishers, vol. 7(2), pages 119-189, November.
    3. Blakeley B. McShane & David Gal, 2017. "Rejoinder: Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 904-908, July.
    4. Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Chr, 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.
      • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Bjorn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambe, 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
    5. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    6. Tom Engsted, 2002. "Measures of Fit for Rational Expectations Models," Journal of Economic Surveys, Wiley Blackwell, vol. 16(3), pages 301-355, July.
    7. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    8. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    9. Jesper W. Schneider, 2015. "Null hypothesis significance tests. A mix-up of two different theories: the basis for widespread confusion and numerous misinterpretations," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 411-432, January.
    10. Blakeley B. McShane & David Gal, 2017. "Statistical Significance and the Dichotomization of Evidence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 885-895, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    2. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    3. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    4. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    5. Engsted, Tom & Schneider, Jesper W., 2023. "Non-Experimental Data, Hypothesis Testing, and the Likelihood Principle: A Social Science Perspective," SocArXiv nztk8, Center for Open Science.
    6. Luigi Pace & Alessandra Salvan, 2020. "Likelihood, Replicability and Robbins' Confidence Sequences," International Statistical Review, International Statistical Institute, vol. 88(3), pages 599-615, December.
    7. Sadri, Arash, 2022. "The Ultimate Cause of the “Reproducibility Crisis”: Reductionist Statistics," MetaArXiv yxba5, Center for Open Science.
    8. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    9. Jae H. Kim, 2022. "Moving to a world beyond p-value," Review of Managerial Science, Springer, vol. 16(8), pages 2467-2493, November.
    10. Holla, Alaka & Bendini, Maria Magdalena & Dinarte Diaz, Lelys Ileana & Trako, Iva, 2021. "Is Investment in Preprimary Education Too Low? Lessons from (Quasi) Experimental Evidence across Countries," Policy Research Working Paper Series 9723, The World Bank.
    11. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    12. Anna Sokolova, 2023. "Marginal Propensity to Consume and Unemployment: a Meta-analysis," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 51, pages 813-846, December.
    13. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    14. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    15. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    16. Nelson, Jon Paul, 2020. "Fixed-effect versus random-effects meta-analysis in economics: A study of pass-through rates for alcohol beverage excise taxes," Economics Discussion Papers 2020-1, Kiel Institute for the World Economy (IfW Kiel).
    17. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    18. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    19. Rinne, Sonja, 2024. "Estimating the merit-order effect using coarsened exact matching: Reconciling theory with the empirical results to improve policy implications," Energy Policy, Elsevier, vol. 185(C).
    20. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.

    More about this item

    Keywords

    Hypothesis testing; false discoveries; replication crisis; prior probabilities; Bayesian analysis; statistical versus economic significance;

    JEL classification:

    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C11 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Bayesian Analysis: General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C52 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Evaluation, Validation, and Selection

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ejw:journl:v:21:y:2024:i:1:p:92-112. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Jason Briggeman (email available below). General contact details of provider: https://edirc.repec.org/data/edgmuus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.