
NHST is still logically flawed

Author

Listed:
  • Jesper W. Schneider

    (Aarhus University)

Abstract

In this elaborate response to Wu (in Scientometrics, 2018), I maintain that null hypothesis significance testing (NHST) is logically flawed. Wu (2018) disagrees with this claim, presented in Schneider (in Scientometrics 102(1):411–432, 2015). In this response, I examine the claim in more depth and demonstrate that since NHST is based on one conditional probability alone and framed in a probabilistic modus tollens framework of reasoning, it is by definition logically invalid. I also argue that disregarding this logical fallacy, as most researchers do, and treating the p value as a heuristic value for dichotomous decisions against the null hypothesis, is a risky business that often leads to false-positive claims.
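To make the point concrete: a small p value reports only P(data at least this extreme | H0); it says nothing directly about P(H0 | data), which is what a rejection decision implicitly trades on. The toy simulation below is an illustration added here, not taken from the article; the prior share of true effects, the power, and the alpha level are assumed values chosen for the example. Under those assumptions, routine rejection at p < 0.05 still leaves nearly half of all "discoveries" false when true effects are rare, echoing the argument of Ioannidis (2005) listed in the references.

```python
# Minimal sketch (assumed, illustrative numbers): why a small P(data | H0)
# need not imply a small P(H0 | data). With many true nulls and modest power,
# a large share of "significant" results are false positives.
import numpy as np

rng = np.random.default_rng(0)

n_tests = 100_000     # hypothetical studies
prior_true = 0.10     # assumed share of studies where H0 is actually false
power = 0.50          # assumed power against the true effects
alpha = 0.05          # conventional significance threshold

h0_false = rng.random(n_tests) < prior_true
# A test "rejects" with probability `power` when H0 is false,
# and with probability `alpha` (by construction) when H0 is true.
rejects = np.where(h0_false,
                   rng.random(n_tests) < power,
                   rng.random(n_tests) < alpha)

false_positive_share = np.mean(~h0_false[rejects])
print(f"Share of rejections where H0 was in fact true: {false_positive_share:.2f}")
# Analytically: alpha*(1-prior_true) / (alpha*(1-prior_true) + power*prior_true)
# = 0.045 / 0.095 ≈ 0.47, i.e. nearly half of the "discoveries" are false.
```

The simulation does not depend on any particular test statistic; it only uses the defining property that a valid test rejects a true null with probability alpha, which is exactly the single conditional probability the abstract refers to.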

Suggested Citation

  • Jesper W. Schneider, 2018. "NHST is still logically flawed," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 627-635, April.
  • Handle: RePEc:spr:scient:v:115:y:2018:i:1:d:10.1007_s11192-018-2655-4
    DOI: 10.1007/s11192-018-2655-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-018-2655-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-018-2655-4?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    3. Jinshan Wu, 2018. "Is there an intrinsic logical error in null hypothesis significance tests? Commentary on: “Null hypothesis significance tests. A mix-up of two different theories: the basis for widespread confusion and numerous misinterpretations”," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 621-625, April.
    4. Sellke T. & Bayarri M. J. & Berger J. O., 2001. "Calibration of p Values for Testing Precise Null Hypotheses," The American Statistician, American Statistical Association, vol. 55, pages 62-71, February.
    5. Jesper W. Schneider, 2015. "Null hypothesis significance tests. A mix-up of two different theories: the basis for widespread confusion and numerous misinterpretations," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 411-432, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Alexandre Galvão Patriota, 2018. "Is NHST logically flawed? Commentary on: “NHST is still logically flawed”," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2189-2191, September.
    2. Engsted, Tom & Schneider, Jesper W., 2023. "Non-Experimental Data, Hypothesis Testing, and the Likelihood Principle: A Social Science Perspective," SocArXiv nztk8, Center for Open Science.
    3. Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
    4. Jesper W. Schneider, 2018. "Response to commentary on “Is NHST logically flawed”," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2193-2194, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    2. Nazila Alinaghi & W. Robert Reed, 2021. "Taxes and Economic Growth in OECD Countries: A Meta-analysis," Public Finance Review, , vol. 49(1), pages 3-40, January.
    3. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    4. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    5. Alexandre Galvão Patriota, 2018. "Is NHST logically flawed? Commentary on: “NHST is still logically flawed”," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2189-2191, September.
    6. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2021. "Fifty shades of QE: Comparing findings of central bankers and academics," Journal of Monetary Economics, Elsevier, vol. 120(C), pages 1-20.
    7. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research In Economics.
    8. Hirschauer Norbert & Mußhoff Oliver & Grüner Sven & Frey Ulrich & Theesfeld Insa & Wagner Peter, 2016. "Die Interpretation des p-Wertes – Grundsätzliche Missverständnisse," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 236(5), pages 557-575, October.
    9. Black, Bernard & Hollingsworth, Alex & Nunes, Letícia & Simon, Kosali, 2022. "Simulated power analyses for observational studies: An application to the Affordable Care Act Medicaid expansion," Journal of Public Economics, Elsevier, vol. 213(C).
    10. Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Becker Claudia, 2019. "Twenty Steps Towards an Adequate Inferential Interpretation of p-Values in Econometrics," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 239(4), pages 703-721, August.
    11. Campbell R. Harvey & Yan Liu, 2020. "False (and Missed) Discoveries in Financial Economics," Papers 2006.04269, arXiv.org.
    12. Jesper W. Schneider, 2015. "Null hypothesis significance tests. A mix-up of two different theories: the basis for widespread confusion and numerous misinterpretations," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 411-432, January.
    13. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    14. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    15. Mayo, Deborah & Morey, Richard Donald, 2017. "A Poor Prognosis for the Diagnostic Screening Critique of Statistical Tests," OSF Preprints ps38b, Center for Open Science.
    16. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.
    17. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    18. Asatryan, Zareh & Havlik, Annika & Heinemann, Friedrich & Nover, Justus, 2020. "Biases in fiscal multiplier estimates," European Journal of Political Economy, Elsevier, vol. 63(C).
    19. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    20. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:115:y:2018:i:1:d:10.1007_s11192-018-2655-4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.