
Critical Values Robust to P-hacking

Author

Listed:
  • Adam McCloskey
  • Pascal Michaillat

Abstract

P-hacking is prevalent in reality but absent from classical hypothesis testing theory. As a consequence, significant results are much more common than they are supposed to be when the null hypothesis is in fact true. In this paper, we build a model of hypothesis testing with p-hacking. From the model, we construct critical values such that, if the values are used to determine significance, and if scientists' p-hacking behavior adjusts to the new significance standards, significant results occur with the desired frequency. Because such robust critical values allow for p-hacking, they are larger than classical critical values. To illustrate the amount of correction that p-hacking might require, we calibrate the model using evidence from the medical sciences. In the calibrated model, the robust critical value for any test statistic is the classical critical value for the same test statistic with one-fifth of the significance level.
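To make the calibrated correction concrete, here is a minimal sketch of the one-fifth rule stated in the abstract, assuming a two-sided z-test; the function names and the choice of test are illustrative and not taken from the paper.

```python
# Minimal sketch: classical vs. p-hacking-robust critical values for a
# two-sided z-test. The "one-fifth" calibration from the abstract is applied
# by evaluating the classical critical value at alpha / 5.
from scipy.stats import norm

def classical_critical_value(alpha: float) -> float:
    """Classical two-sided z critical value at significance level alpha."""
    return norm.ppf(1 - alpha / 2)

def robust_critical_value(alpha: float, correction: float = 5.0) -> float:
    """Robust critical value under the calibrated one-fifth rule:
    the classical critical value evaluated at alpha / correction."""
    return classical_critical_value(alpha / correction)

if __name__ == "__main__":
    alpha = 0.05
    print(f"classical critical value: {classical_critical_value(alpha):.3f}")  # about 1.960
    print(f"robust critical value:    {robust_critical_value(alpha):.3f}")     # about 2.576
```

At the conventional 5% level, the robust value coincides with the classical critical value at the 1% level, which is why it exceeds the familiar 1.96 threshold.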

Suggested Citation

  • Adam McCloskey & Pascal Michaillat, 2020. "Critical Values Robust to P-hacking," Papers 2005.04141, arXiv.org, revised Dec 2023.
  • Handle: RePEc:arx:papers:2005.04141

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2005.04141
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. Kerry Dwan & Douglas G Altman & Juan A Arnaiz & Jill Bloom & An-Wen Chan & Eugenia Cronin & Evelyne Decullier & Philippa J Easterbrook & Erik Von Elm & Carrol Gamble & Davina Ghersi & John P A Ioannidis, 2008. "Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias," PLOS ONE, Public Library of Science, vol. 3(8), pages 1-31, August.
    3. John Gibson & David L. Anderson & John Tressler, 2014. "Which Journal Rankings Best Explain Academic Salaries? Evidence From The University Of California," Economic Inquiry, Western Economic Association International, vol. 52(4), pages 1322-1340, October.
    4. John Gibson & David L. Anderson & John Tressler, 2017. "Citations Or Journal Quality: Which Is Rewarded More In The Academic Labor Market?," Economic Inquiry, Western Economic Association International, vol. 55(4), pages 1945-1965, October.
    5. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    6. Yuqing Zheng & Harry M. Kaiser, 2016. "Submission Demand In Core Economics Journals: A Panel Study," Economic Inquiry, Western Economic Association International, vol. 54(2), pages 1319-1338, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    2. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    3. Syed Hasan & Robert Breunig, 2021. "Article length and citation outcomes," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7583-7608, September.
    4. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    5. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    6. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    8. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    9. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    10. María Victoria Anauati & Sebastian Galiani & Ramiro H. Gálvez, 2018. "Differences in citation patterns across journal tiers in economics," Documentos de Trabajo 16701, The Latin American and Caribbean Economic Association (LACEA).
    11. María Victoria Anauati & Sebastian Galiani & Ramiro H. Gálvez, 2020. "Differences In Citation Patterns Across Journal Tiers: The Case Of Economics," Economic Inquiry, Western Economic Association International, vol. 58(3), pages 1217-1232, July.
    12. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    13. Felgenhauer, Mike, 2021. "Experimentation and manipulation with preregistration," Games and Economic Behavior, Elsevier, vol. 130(C), pages 400-408.
    14. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    15. Michał Krawczyk, 2015. "The Search for Significance: A Few Peculiarities in the Distribution of P Values in Experimental Psychology Literature," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-19, June.
    16. Robbie C M van Aert & Jelte M Wicherts & Marcel A L M van Assen, 2019. "Publication bias examined in meta-analyses from psychology and medicine: A meta-meta-analysis," PLOS ONE, Public Library of Science, vol. 14(4), pages 1-32, April.
    17. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    18. Jordan C. Stanley & Evan S. Totty, 2024. "Synthetic Data and Social Science Research: Accuracy Assessments and Practical Considerations from the SIPP Synthetic Beta," NBER Chapters, in: Data Privacy Protection and the Conduct of Applied Research: Methods, Approaches and their Consequences, National Bureau of Economic Research, Inc.
    19. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    20. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2005.04141. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.