
Optimal Pre-analysis Plans: Statistical Decisions Subject to Implementability

Author

Listed:
  • Kasy, Maximilian (University of Oxford)
  • Spiess, Jann (Stanford University)

Abstract

What is the purpose of pre-analysis plans, and how should they be designed? We model the interaction between an agent who analyzes data and a principal who makes a decision based on agent reports. The agent could be the manufacturer of a new drug, and the principal a regulator deciding whether the drug is approved. Or the agent could be a researcher submitting a research paper, and the principal an editor deciding whether it is published. The agent decides which statistics to report to the principal. The principal cannot verify whether the agent reported selectively. Absent a pre-analysis message, if there are conflicts of interest, then many desirable decision rules cannot be implemented. Allowing the agent to send a message before seeing the data increases the set of decision rules that can be implemented, and allows the principal to leverage agent expertise. The optimal mechanisms that we characterize require pre-analysis plans. Applying these results to hypothesis testing, we show that optimal rejection rules pre-register a valid test, and make worst-case assumptions about unreported statistics. Optimal tests can be found as a solution to a linear-programming problem.
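
The linear-programming characterization in the last sentence can be made concrete. Below is a minimal sketch in Python of how a most-powerful level-alpha rejection rule can be computed as a linear program over a finite sample space. All names and numbers (p_null, p_alt, alpha, the four-point sample space) are illustrative assumptions, not taken from the paper; in particular, the paper's optimal tests involve implementability constraints derived from the agent's selective-reporting problem, which this simplified program replaces with a plain size constraint.

```python
# Minimal illustrative sketch: computing a most-powerful level-alpha test by
# linear programming over a finite sample space. The distributions and the
# plain size constraint are assumptions for illustration; this is not the
# paper's exact program, which adds implementability constraints reflecting
# worst-case assumptions about unreported statistics.
import numpy as np
from scipy.optimize import linprog

p_null = np.array([0.4, 0.3, 0.2, 0.1])  # P(X = x) under the null (assumed)
p_alt = np.array([0.1, 0.2, 0.3, 0.4])   # P(X = x) under the alternative (assumed)
alpha = 0.05                             # size constraint (assumed)

# Decision variables: rejection probabilities phi(x) in [0, 1], one per x.
# Maximize power = sum_x phi(x) * p_alt(x); linprog minimizes, so negate.
res = linprog(
    c=-p_alt,                            # objective: minimize -power
    A_ub=p_null[None, :],                # size: sum_x phi(x) * p_null(x) <= alpha
    b_ub=np.array([alpha]),
    bounds=[(0.0, 1.0)] * len(p_null),   # each phi(x) is a probability
)
phi = res.x
print("rejection probabilities:", np.round(phi, 3))
print("power:", round(-res.fun, 3))
```

Because both the objective (power) and the constraints are linear in the rejection probabilities, richer restrictions, such as worst-case bounds for statistics the agent might withhold, would enter as additional linear inequality rows in A_ub, leaving the problem a linear program.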

Suggested Citation

  • Kasy, Maximilian & Spiess, Jann, 2024. "Optimal Pre-analysis Plans: Statistical Decisions Subject to Implementability," IZA Discussion Papers 17187, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp17187
    Download full text from publisher

    File URL: https://docs.iza.org/dp17187.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Nathan Yoder, 2022. "Designing Incentives for Heterogeneous Researchers," Journal of Political Economy, University of Chicago Press, vol. 130(8), pages 2018-2054.
    2. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    3. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    4. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2021. "Strategic Sample Selection," Econometrica, Econometric Society, vol. 89(2), pages 911-953, March.
    5. Emeric Henry & Marco Ottaviani, 2019. "Research and the Approval Process: The Organization of Persuasion," American Economic Review, American Economic Association, vol. 109(3), pages 911-955, March.
    6. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    7. Sylvain Chassang & Gerard Padro I Miquel & Erik Snowberg, 2012. "Selective Trials: A Principal-Agent Approach to Randomized Controlled Experiments," American Economic Review, American Economic Association, vol. 102(4), pages 1279-1309, June.
    8. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    9. Ying Gao, 2022. "Inference from Selectively Disclosed Data," Papers 2204.07191, arXiv.org, revised Nov 2023.
    10. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    11. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    12. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    13. Jacob Glazer & Ariel Rubinstein, 2004. "On Optimal Rules of Persuasion," Econometrica, Econometric Society, vol. 72(6), pages 1715-1736, November.
    14. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    15. Myerson, Roger B., 1986. "Multistage Games with Communication," Econometrica, Econometric Society, vol. 54(2), pages 323-358, March.
    16. Mathis, Jérôme, 2008. "Full revelation of information in Sender-Receiver games of persuasion," Journal of Economic Theory, Elsevier, vol. 143(1), pages 571-584, November.
    17. Aleksey Tetenov, 2016. "An economic theory of statistical testing," CeMMAP working papers CWP50/16, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    18. Edward L. Glaeser, 2006. "Researcher Incentives and Empirical Methods," NBER Technical Working Papers 0329, National Bureau of Economic Research, Inc.
    19. Emir Kamenica, 2019. "Bayesian Persuasion and Information Design," Annual Review of Economics, Annual Reviews, vol. 11(1), pages 249-272, August.
    20. Gneiting, Tilmann & Raftery, Adrian E., 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    21. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.
    22. Abhijit V. Banerjee & Sylvain Chassang & Sergio Montero & Erik Snowberg, 2020. "A Theory of Experimenters: Robustness, Randomization, and Balance," American Economic Review, American Economic Association, vol. 110(4), pages 1206-1230, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    2. Maximilian Kasy & Jann Spiess, 2024. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 1058, University of Oxford, Department of Economics.
    3. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    4. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Apr 2024.
    5. Felgenhauer, Mike, 2021. "Experimentation and manipulation with preregistration," Games and Economic Behavior, Elsevier, vol. 130(C), pages 400-408.
    6. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," LCERPA Working Papers am0132, Laurier Centre for Economic Research and Policy Analysis.
    7. So, Tony, 2020. "Classroom experiments as a replication device," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    8. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    9. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    10. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2021. "Strategic Sample Selection," Econometrica, Econometric Society, vol. 89(2), pages 911-953, March.
    11. Herresthal, Claudia, 2022. "Hidden testing and selective disclosure of evidence," Journal of Economic Theory, Elsevier, vol. 200(C).
    12. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    13. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    14. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    15. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    16. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    17. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    18. James R. Stevenson & Karen Macours & Douglas Gollin, 2023. "The Rigor Revolution: New Standards of Evidence for Impact Assessment of International Agricultural Research," Annual Review of Resource Economics, Annual Reviews, vol. 15(1), pages 495-515, October.
    19. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    20. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.

    More about this item

    Keywords

    pre-analysis plans; statistical decisions; implementability;

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • D8 - Microeconomics - - Information, Knowledge, and Uncertainty
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izadps:dp17187. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.