
Unlucky Number 13? Manipulating Evidence Subject to Snooping

Authors

  • Uwe Hassler
  • Marc‐Oliver Pohle

Abstract

Questionable research practices have generated considerable recent interest throughout and beyond the scientific community. We subsume such practices involving secret data snooping that influences subsequent statistical inference under the term MESSing (manipulating evidence subject to snooping) and discuss, illustrate and quantify the possibly dramatic effects of several forms of MESSing using an empirical and a simple theoretical example. The empirical example uses numbers from the most popular German lottery, which seem to suggest that 13 is an unlucky number.
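
To make the snooping effect concrete, here is a minimal simulation sketch (not taken from the paper; the numbers of draws and replications are illustrative assumptions). It mimics a fair 6-out-of-49 lottery and then, only after looking at the data, singles out the least frequently drawn number and tests it with a one-sided binomial test as if that number had been chosen in advance. Because the hypothesis is selected by the data, the nominal 5% test rejects far more often than 5%, which is the kind of distortion the abstract summarizes as MESSing.

```python
# Minimal sketch (illustrative assumptions, not the authors' code): quantify how
# data snooping inflates the size of a nominal 5% test in a fair 6-out-of-49 lottery.
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(13)

N_DRAWS = 1000   # hypothetical number of lottery draws observed
N_REPS = 500     # Monte Carlo replications
P0 = 6 / 49      # probability that any fixed number appears in a single draw

snooped_rejections = 0
for _ in range(N_REPS):
    # simulate a fair lottery: each draw selects 6 distinct numbers out of 49
    counts = np.zeros(49, dtype=int)
    for _ in range(N_DRAWS):
        counts[rng.choice(49, size=6, replace=False)] += 1

    # snooping step: only after seeing the data, single out the rarest number
    # (e.g. "13 looks unlucky") ...
    k_min = int(counts.min())

    # ... and then test it with a one-sided binomial test as if it had been
    # fixed before the data were seen
    p_value = binomtest(k_min, n=N_DRAWS, p=P0, alternative="less").pvalue
    snooped_rejections += p_value < 0.05

print(f"rejection rate under snooping: {snooped_rejections / N_REPS:.2%} "
      f"(nominal level: 5%)")
```

The printed rejection rate should come out well above the nominal 5%, even though every number is equally likely by construction. The same mechanism operates whenever a hypothesis, subsample or model is chosen after inspecting the data while the subsequent inference ignores that selection step.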

Suggested Citation

  • Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
  • Handle: RePEc:bla:istatr:v:90:y:2022:i:2:p:397-410
    DOI: 10.1111/insr.12488

    Download full text from publisher

    File URL: https://doi.org/10.1111/insr.12488
    Download Restriction: no

    File URL: https://libkey.io/10.1111/insr.12488?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    References listed on IDEAS

    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. Joe, Harry, 1993. "Tests of uniformity for sets of lotto numbers," Statistics & Probability Letters, Elsevier, vol. 16(3), pages 181-188, February.
    3. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    4. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    5. Leeb, Hannes & Pötscher, Benedikt M., 2005. "Model Selection And Inference: Facts And Fiction," Econometric Theory, Cambridge University Press, vol. 21(1), pages 21-59, February.
    6. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    7. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    8. Ronald L. Wasserstein & Nicole A. Lazar, 2016. "The ASA's Statement on p-Values: Context, Process, and Purpose," The American Statistician, Taylor & Francis Journals, vol. 70(2), pages 129-133, May.
    9. Ronald D. Fricker & Katherine Burke & Xiaoyan Han & William H. Woodall, 2019. "Assessing the Statistical Analyses Used in Basic and Applied Social Psychology After Their p-Value Ban," The American Statistician, Taylor & Francis Journals, vol. 73(S1), pages 374-384, March.
    10. Jarque, Carlos M. & Bera, Anil K., 1980. "Efficient tests for normality, homoscedasticity and serial independence of regression residuals," Economics Letters, Elsevier, vol. 6(3), pages 255-259.
    11. Halbert White, 2000. "A Reality Check for Data Snooping," Econometrica, Econometric Society, vol. 68(5), pages 1097-1126, September.
    12. David Romer, 2020. "In Praise of Confidence Intervals," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 55-60, May.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Fanhui Meng & Haoming Sun & Jiarong Xie & Chengjun Wang & Jiajing Wu & Yanqing Hu, 2021. "Preference for Number of Friends in Online Social Networks," Future Internet, MDPI, vol. 13(9), pages 1-13, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    2. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    3. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    4. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    5. Adam Gorajek & Joel Bank & Andrew Staib & Benjamin Malin & Hamish Fitchett, 2021. "Star Wars at Central Banks," RBA Research Discussion Papers rdp2021-02, Reserve Bank of Australia.
    6. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    7. Alexander L. Brown & Taisuke Imai & Ferdinand M. Vieider & Colin F. Camerer, 2024. "Meta-analysis of Empirical Estimates of Loss Aversion," Journal of Economic Literature, American Economic Association, vol. 62(2), pages 485-516, June.
    8. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    9. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    10. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    11. Adam Gorajek & Benjamin A. Malin, 2021. "Comment on "Star Wars: The Empirics Strike Back"," Staff Report 629, Federal Reserve Bank of Minneapolis.
    12. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    13. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    14. Andrew Y. Chen & Tom Zimmermann, 2022. "Publication Bias in Asset Pricing Research," Papers 2209.13623, arXiv.org, revised Sep 2023.
    15. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    16. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    17. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    18. Matteo Picchio & Michele Ubaldi, 2024. "Unemployment and health: A meta‐analysis," Journal of Economic Surveys, Wiley Blackwell, vol. 38(4), pages 1437-1472, September.
    19. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    20. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan; International Monetary Fund, vol. 71(2), pages 538-573, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:istatr:v:90:y:2022:i:2:p:397-410. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery. General contact details of provider: https://edirc.repec.org/data/isiiinl.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.