Printed from https://ideas.repec.org/p/iza/izadps/dp17187.html

Optimal Pre-analysis Plans: Statistical Decisions Subject to Implementability

Author

Listed:
  • Kasy, Maximilian

    (University of Oxford)

  • Spiess, Jann

    (Stanford University)

Abstract

What is the purpose of pre-analysis plans, and how should they be designed? We model the interaction between an agent who analyzes data and a principal who makes a decision based on agent reports. The agent could be the manufacturer of a new drug, and the principal a regulator deciding whether the drug is approved. Or the agent could be a researcher submitting a research paper, and the principal an editor deciding whether it is published. The agent decides which statistics to report to the principal. The principal cannot verify whether the analyst reported selectively. Absent a pre-analysis message, if there are conflicts of interest, then many desirable decision rules cannot be implemented. Allowing the agent to send a message before seeing the data increases the set of decision rules that can be implemented, and allows the principal to leverage agent expertise. The optimal mechanisms that we characterize require pre-analysis plans. Applying these results to hypothesis testing, we show that optimal rejection rules pre-register a valid test, and make worst-case assumptions about unreported statistics. Optimal tests can be found as a solution to a linear-programming problem.
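The abstract's closing claim, that optimal tests solve a linear program, can be illustrated with a generic Neyman-Pearson-style formulation. This is not the paper's own linear program: the sketch below simply shows how, on a finite sample space with made-up null and alternative densities, choosing rejection probabilities to maximize power subject to a size constraint is a linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative densities on a four-point sample space (hypothetical numbers).
p0 = np.array([0.4, 0.3, 0.2, 0.1])  # null density
p1 = np.array([0.1, 0.2, 0.3, 0.4])  # alternative density
alpha = 0.05                         # significance level

# Choose rejection probabilities phi(x) in [0, 1] to maximize power
# sum(phi * p1), subject to the size constraint sum(phi * p0) <= alpha.
# linprog minimizes, so we pass -p1 as the objective.
res = linprog(c=-p1, A_ub=[p0], b_ub=[alpha], bounds=[(0, 1)] * len(p0))
phi = res.x  # optimal (possibly randomized) rejection rule
```

Because the objective and constraint are both linear in phi, the solution concentrates rejection probability on the points with the highest likelihood ratio p1/p0, exhausting the size budget, which is exactly the Neyman-Pearson structure.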

Suggested Citation

  • Kasy, Maximilian & Spiess, Jann, 2024. "Optimal Pre-analysis Plans: Statistical Decisions Subject to Implementability," IZA Discussion Papers 17187, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp17187

    Download full text from publisher

    File URL: https://docs.iza.org/dp17187.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Nathan Yoder, 2022. "Designing Incentives for Heterogeneous Researchers," Journal of Political Economy, University of Chicago Press, vol. 130(8), pages 2018-2054.
    2. Jérôme Mathis, 2008. "Full Revelation of Information in Sender-Receiver Games of Persuasion," Journal of Economic Theory, Elsevier, vol. 143(1), pages 571-584, November.
    3. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    4. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    5. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2021. "Strategic Sample Selection," Econometrica, Econometric Society, vol. 89(2), pages 911-953, March.
    6. Emeric Henry & Marco Ottaviani, 2019. "Research and the Approval Process: The Organization of Persuasion," American Economic Review, American Economic Association, vol. 109(3), pages 911-955, March.
    7. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    8. Aleksey Tetenov, 2016. "An Economic Theory of Statistical Testing," CeMMAP working papers CWP50/16, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    9. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    10. Roger B. Myerson, 1986. "Multistage Games with Communication," Econometrica, Econometric Society, vol. 54(2), pages 323-358, March.
    11. Edward L. Glaeser, 2006. "Researcher Incentives and Empirical Methods," NBER Technical Working Papers 0329, National Bureau of Economic Research, Inc.
    12. Emir Kamenica, 2019. "Bayesian Persuasion and Information Design," Annual Review of Economics, Annual Reviews, vol. 11(1), pages 249-272, August.
    13. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    14. Tilmann Gneiting & Adrian E. Raftery, 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    15. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    16. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.
    17. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    18. Jacob Glazer & Ariel Rubinstein, 2004. "On Optimal Rules of Persuasion," Econometrica, Econometric Society, vol. 72(6), pages 1715-1736, November.

    Most related items

    These are the items that most often cite the same works as this paper, and that are most often cited by the same works that cite this paper.
    1. Maximilian Kasy & Jann Spiess, 2024. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," CESifo Working Paper Series 11258, CESifo.
    2. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    3. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Jul 2024.
    4. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    5. Felgenhauer, Mike, 2021. "Experimentation and manipulation with preregistration," Games and Economic Behavior, Elsevier, vol. 130(C), pages 400-408.
    6. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    7. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Apr 2024.
    8. So, Tony, 2020. "Classroom experiments as a replication device," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    9. Herresthal, Claudia, 2022. "Hidden testing and selective disclosure of evidence," Journal of Economic Theory, Elsevier, vol. 200(C).
    10. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    11. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    12. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    13. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    14. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    15. Matthias Lang, 2020. "Mechanism Design with Narratives," CESifo Working Paper Series 8502, CESifo.
    16. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    17. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    18. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    19. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    20. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.

    More about this item

    Keywords

    pre-analysis plans; statistical decisions; implementability

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • D8 - Microeconomics - - Information, Knowledge, and Uncertainty
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izadps:dp17187. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Registration links your profile to this item and lets you accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.